At 04:58 PM 3/13/2013, Jen Rasmussen wrote:

>Have you tried keeping all of your documents in one directory and blocking
>that directory via a robots.txt file?

A spider used by a pirate site does not have to honor robots.txt, just as a 
non-Adobe PDF utility does not have to honor security settings imposed by 
Acrobat Pro. The use of robots.txt would succeed mainly in blocking major 
search engines, which are not the problem.
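For reference, the directory block Jen describes would amount to something
like this (assuming a hypothetical /documents/ directory), and it is purely
a request, not an enforcement mechanism:

    User-agent: *
    Disallow: /documents/

A well-behaved crawler such as Googlebot will read that and stay out; a
scraper written for a pirate site can simply ignore it, or even treat it as
a map of where the interesting files live.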

Dale H. Cook, Member, NEHGS and MA Society of Mayflower Descendants;
Plymouth Co. MA Coordinator for the USGenWeb Project
Administrator of http://plymouthcolony.net  

