On Tue, 2006-05-02 at 15:50 -0500, Igor Chudov wrote:
> On Tue, May 02, 2006 at 01:39:26PM -0700, List Mail User wrote:
> > >...
> > >For the last week, I feel like I should receive a paycheck from
> > >Geocities! All I've been doing is submitting damn redirect web
> > >pages. I even did some testing and found some sites listed in NANAS
> > >as far back as 5 days that were still active.
> > >
> > >The source code for these pages uses at most 3-4 different
> > >techniques. Not very hard to filter for on new pages. Hell, I think
> > >100% of the redirected URLs were listed in URIBL black!! Every
> > >freaking morning I see more Geocities redirects. Whatever they are
> > >doing could be a lot better.
> > >
> > >Checking on ones from Sunday, I see they are still running, even
> > >after being reported. At this rate, the Geocities redirects are
> > >lasting longer than new domains.
> > >
> > >Chris Santerre
> > >SysAdmin and SARE/URIBL ninja
> > >http://www.uribl.com
> > >http://www.rulesemporium.com
> > >...
> >
> > Even worse, they will close a site, and then another site with
> > exactly identical content will appear (probably created at the same
> > time). Creating their own blacklist of already-nuked sites seems
> > pretty trivial. Also, the use of Yahoo! sites for hosting spammer
> > images, where the directories under the root remain constant, seems
> > another easy case to wipe out, but they haven't. In their favor, it
> > seems that Yahoo! is now only the second-largest source of child
> > pornography in the world, down from #1 because so many of the sites
> > are now hosted on zombies (though often advertised via sites on
> > Geocities that redirect to them).
>
> It is not so simple. I looked at the source code of these spammers'
> websites. They are made with heavily obfuscated JavaScript, which is
> not easy to recognize programmatically. An open question is why
> Geocities allows JavaScript at all, but that seems to be a business
> decision for them.
>
> Still, the fact that the pages stay up so long after being
> spamvertised suggests that they are not using spam traps.
>
> I do not think they could fully eliminate all spammy pages, but they
> can make themselves a very unattractive target for spammers, just by
> doing the following:
>
> 1) not allowing JavaScript
> 2) using intelligent filters on the URLs in links
> 3) using spam traps
> 4) allowing a craigslist-style "this is spam" button, feeding item 2)
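Chris is right that these pages use only a handful of redirect tricks,
so a filter on new or changed pages (item 2 above) should not be hard.
As a rough sketch in Python - the tiny blacklist set here is a made-up
stand-in for a real URIBL/SURBL lookup, not anyone's actual pipeline -
something like this would catch the obvious cases:

import re

# The common redirect tricks seen on these pages.
REDIRECT_PATTERNS = [
    # <meta http-equiv="refresh" content="0;url=http://spam.example/">
    re.compile(r'http-equiv=["\']?refresh["\']?[^>]*url=([^"\'>\s]+)',
               re.I),
    # window.location / document.location assignments in JavaScript
    re.compile(r'(?:window|document)\.location(?:\.href)?\s*=\s*'
               r'["\']([^"\']+)', re.I),
    # a full-page (i)frame pointing at the real site
    re.compile(r'<i?frame[^>]+src=["\']?(https?://[^"\'>\s]+)', re.I),
]

def redirect_targets(html):
    """Return the URLs a page tries to bounce visitors to."""
    targets = []
    for pattern in REDIRECT_PATTERNS:
        targets.extend(pattern.findall(html))
    return targets

def looks_spammy(html, blacklisted_hosts):
    """True if the page redirects to a blacklisted host."""
    for url in redirect_targets(html):
        host = url.split('//')[-1].split('/')[0].lower()
        if host in blacklisted_hosts:  # stand-in for a real URIBL query
            return True
    return False

# Made-up example data, not a real listing:
page = '<meta http-equiv="refresh" content="0;url=http://spam.example/x">'
print(looks_spammy(page, {'spam.example'}))  # True

Pages that trip a check like this could be held for review instead of
being published straight away.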
5) cronjob a process that downloads every site that has changed and
checks both the page as displayed and its source for spam signs; even
plain old wget has options to dump the resulting HTML.

My 5 cents: Geocities earns enough money from displaying these sites,
and has a name that keeps it from getting blacklisted world-wide. Right
now it only ends up on people's local blacklists, but I like the idea
of a world-wide (URIBL, SURBL, ...) blacklist entry. Maybe a poll on
the major blacklist sites could help. When could we change "big and
with a lot of good guys" into "big, with a lot of good guys and too
many bad guys, so sorry for the few good guys"?

-- 
with kind regards,

Maurice Lucas
TAOS-IT
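P.S. A rough sketch of the cron job from item 5, again in Python. The
feed file /var/tmp/changed-pages.txt is hypothetical (one URL per line,
produced by whatever notices changed sites); it shells out to plain old
wget as described and greps the fetched source for the same redirect
signs:

import subprocess

# Signs of the redirect pages, matched case-insensitively below.
SPAM_SIGNS = ('http-equiv="refresh"', 'window.location',
              'document.location')

def fetch(url):
    """Download one page to stdout with wget; '' on any failure."""
    result = subprocess.run(['wget', '-q', '-O', '-', '--timeout=10',
                             url],
                            capture_output=True, text=True)
    return result.stdout if result.returncode == 0 else ''

def main():
    # Hypothetical feed of recently changed sites, one URL per line.
    with open('/var/tmp/changed-pages.txt') as feed:
        for line in feed:
            url = line.strip()
            if not url:
                continue
            if any(sign in fetch(url).lower() for sign in SPAM_SIGNS):
                print('suspect:', url)

if __name__ == '__main__':
    main()

Run from cron every few minutes, hits would go to an abuse queue for a
human to confirm rather than straight to deletion.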