>>> "Benny Pedersen" <m...@junc.org> 06/28/09 12:42 AM >>> >On Sun, June 28, 2009 05:38, Cory Hawkless wrote: >> I agree, wouldn't it be easier to uniformly feed all of these type of URL's >> though the already existing SA filters. As Jason suggested maybe by >> collapsing whitespaces? > >lets redefine how a url is in the first place ? > >www localhost localdomain >www.localhost.localdomain > >one of them does not work :) > >spammers more or less just use the first one, so what ? > >> Sounds like the obvious solution to me? Any problems with this? If not how >> can it be done? > >just show a working ReplaceTags for spaces, and then all can be solved to make >rules with how spaces can rebuild into no spaces, >eg in my above example " " will be "." and then sa see the last url and first >url > >imho this is what replacetags does > >but as long webbrowsers does not work on both, is it a big problem so ?
It is folly to underestimate the stupidity and/or gullibility of humans. Just because the link "won't work" as-is in the message does NOT mean people out there won't retype it, corrected, into their browser's address box.

It is my opinion that if the spammers weren't getting traffic to the websites from the email, they would stop sending the email. Since the emails keep coming, we must presume they are having some success in attracting victims to the sites. Therefore, obfuscating the URL by replacing the dots with spaces seems to be a viable spam indicator. The tricky part is figuring out how to detect that trait reliably without tripping over similar-looking text that is not a good spam indicator.
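For what it's worth, a rough sketch of a body rule aimed at that dot-less form might look like the config below. This is an untested illustration, not a finished rule: the rule name, the TLD list, and the score are all invented here, and it would need a run against a ham corpus before being scored any higher.

  # Untested sketch: matches "www example com" style hosts where a
  # short run of whitespace stands in for the dots, but not the
  # normal dotted form.
  body      SPACED_WWW_URL  /\bwww\s{1,3}[a-z0-9-]{3,30}\s{1,3}(?:com|net|org|info|biz)\b/i
  describe  SPACED_WWW_URL  Host name written with spaces in place of dots
  score     SPACED_WWW_URL  0.5

Keeping the TLD list short and capping the whitespace run at a few characters should limit the damage, but ordinary prose can still hit it (e.g. "... www and com ..."), so I'd keep the score low or fold it into a meta rule until it's been tested. Benny's ReplaceTags suggestion is the other half of the picture: rebuilding the spaces into dots so the existing URI rules and blocklist lookups see the real hostname, rather than merely flagging the obfuscation itself.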