Question: Google crawling errors, callingpage javascript - Safe to disallow file in robots.txt?

I'm getting a bunch of errors in our crawl report for an intermediary file that calls another product page file.

The URL looks like:
http://mydomain.com/psz.php?cPath=142&products_id=595&callingpage=product_infoI.php
so the callingpage parameter names the final page being called.

Can I add the following to my robots.txt without causing issues?

Disallow: /psz.php
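
I assume a complete rule would also need a User-agent line, so the full entry would look something like this (blocking all crawlers from the intermediary script):

User-agent: *
Disallow: /psz.php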

Thanks.

Answer: Google crawling errors, callingpage javascript - Safe to disallow file in robots.txt?

A workaround I have used for issues with the Netscreen client (an earlier version) is to identify the relevant services and keep a batch file on the desktop that restarts them.

So, for example, a batch file along these lines (run it from an elevated prompt, since stopping services requires administrator rights):

:: Stop both SafeNet services, then start them again in reverse order
NET STOP "safenet ike service"
NET STOP "safenet monitor service"
NET START "safenet monitor service"
NET START "safenet ike service"

Once the services are restarted, the client application works properly again, and this saves the pain of rebooting.

... Thinkpads_User