On Tue, Apr 16, 2013 at 01:57:55PM -0300, chris derham wrote:
> > Or, another way of looking at this would be that for every 40 servers
> > scanned without a 404 delay, the same bot infrastructure within the
> > same time would only be able to scan 1 server if a 1 s 404 delay was
> > implemented by 50% of the webservers.
>
> This assumes that the scanning software makes sequential requests.
> Assuming your suggestion was rolled out (which I think is a good idea
> in principle), wouldn't the scanners be updated to make concurrent
> async requests? At that point you only end up adding 1 second to the
> total original time, which kind of defeats it.
>
> Again, I'd like to state that I think you are onto a good idea, but
> the other important point is that some (most?) of these scans are run
> from botnets. These have zero cost (well, for the bot farmers anyway).
> My point is that even if the proposal worked, they don't care if their
> herd is held up a little longer - they are abusing other people's
> computers/connections, so it doesn't cost them anything directly.
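To make that concurrency point concrete, here is a toy sketch (Python 3,
standard library only; the language, the helper names, and the use of
asyncio.sleep to stand in for a real HTTP round-trip are my own choices,
not anything from the thread). It reuses the 1 s delay and the 40-probe
figure from the quoted numbers:

    import asyncio
    import time

    TARPIT_DELAY = 1.0   # the proposed 1 s delay before serving a 404
    NUM_PROBES = 40      # number of paths the scanner guesses, as above

    async def probe(path):
        # Stand-in for one HTTP request that hits the 404 tarpit:
        # the scanner has to wait out the server's artificial delay.
        await asyncio.sleep(TARPIT_DELAY)
        return path, 404

    async def scan_sequential():
        # One request at a time: pays the full delay on every probe.
        for i in range(NUM_PROBES):
            await probe(f"/guess-{i}")

    async def scan_concurrent():
        # All requests in flight at once: the delays overlap.
        await asyncio.gather(*(probe(f"/guess-{i}")
                               for i in range(NUM_PROBES)))

    for scan in (scan_sequential, scan_concurrent):
        start = time.perf_counter()
        asyncio.run(scan())
        print(f"{scan.__name__}: {time.perf_counter() - start:.1f} s")

The sequential scan takes roughly 40 s, the concurrent one roughly 1 s:
the tarpit costs an async scanner only about one extra second in total,
which is exactly the objection raised above.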
Yes. But someone *does* own the botted computers, and their own
operations are slightly affected. I have wondered whether there is some
way to make a bot so intrusive that many more owners would ask
themselves, "Why is my computer so slow/weird/whatever? I'd better get
it looked at. Maybe I should install a virus scanner." If bots were
killed off at a much higher rate, that *would* affect the botnet
masters. I have no idea how to make bots more visible by messing with
their attacks; I'm just wondering.

Then again, my experience shows that when a computer slows down, most
people either just live with the problem or buy a faster machine. Ugh.

--
Mark H. Wood, Lead System Programmer   mw...@iupui.edu
Machines should not be friendly.  Machines should be obedient.