You are stopping them inside Apache now. The next obvious step is a
firewall, either on the server itself or on a dedicated box in front
of it.
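
For example, on a Linux server you could drop their traffic before it
ever reaches Apache (a minimal sketch, assuming iptables is available;
192.0.2.10 is just a placeholder for the bot's real address):

    # Drop every packet from the offending address before Apache sees it
    iptables -A INPUT -s 192.0.2.10 -j DROP

    # Or drop a whole misbehaving netblock
    iptables -A INPUT -s 192.0.2.0/24 -j DROP

That way each retry costs you nothing but a dropped packet, instead of
an Apache worker and a 403 response per hit.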

Regards,

Christian

On Sat, Dec 15, 2007 at 12:57:17PM -0800, Charles Michener wrote:
> I have a couple of spider bots hitting my server that I do not wish to have 
> access to my pages - they ignore robots.txt, so I finally put them on my 
> 'deny from xxxxx' list. This does deny them access, but they persist in 
> trying - hitting each page address at least 30 times, several hits per 
> second. Is there a standard method to forward them to some black hole or 
> the FBI or ...?
> 
> Charles
> 