>From: Vernon Schryver <[EMAIL PROTECTED]>
>
>> ...
>> The basic idea then would be to trace back bad packets that
>> conform to some typically innocent, but occasionally troublesome,
>> profiles.  The profiles will become self-evident with experience,
>> and once people know they will be caught by this traceback
>> system they will think twice before spreading their crap around.
>
>If I were building a DDoS engine today, I'd write a conventional
>(Microsoft) DOS virus that does nothing except once every 3 minutes do
>the equivalent of:
>
>    echo "GET /index.html HTTP/1.0"; echo) | telnet -r $1 80
>
>(maybe with a random request instead of /index.html)
>
>After a few million desktops have been infected by familiar virus
>vectors, the victim might notice the traffic.
>How would you filter for them?  Even if you could give routers
>enough processing power, what would you learn from the filtering
>that you'd care to apply?

  It is possible to get a bigger bang in this case: the virus may also ask
DNS servers about generated domain names. The way to defeat the
negative-caching effect is simple: a negative answer is cached only for
the specific name queried, so each query uses a freshly generated name.
With millions of names in .com, there is a long way to go before it runs
out of names to keep asking about.

                               - Leonid Yegoshin, LY22
