Probably better than the "spam phrases" approach would be the database
approach currently used for white/black listing.
Is there any way to tie that to an XML retrieval from a list of central
repositories? Does MySQL do replication? A properly structured XML feed would
let us eyeball the list as well as use it to keep the database up to date.
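Something along these lines is what I have in mind (Python just for
illustration; the repository URL, the <domain> element layout, and the table
name are all made up, and I've used SQLite rather than MySQL to keep the
sketch self-contained):

# Hypothetical sketch: pull an XML blacklist from a central repository and
# mirror it into a local database.  The URL, XML layout and table schema
# are assumptions, not an existing SpamAssassin feature.
import urllib.request
import sqlite3
import xml.etree.ElementTree as ET

REPO_URL = "http://repository.example.org/url-blacklist.xml"  # hypothetical

def sync_blacklist(db_path="blacklist.db"):
    xml_data = urllib.request.urlopen(REPO_URL).read()
    # The feed is assumed to carry one <domain> element per listed domain.
    domains = [e.text.strip() for e in ET.fromstring(xml_data).iter("domain")]

    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS url_blacklist (domain TEXT PRIMARY KEY)")
    con.executemany(
        "INSERT OR IGNORE INTO url_blacklist (domain) VALUES (?)",
        [(d,) for d in domains],
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    sync_blacklist()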
Another idea: could we synthesize an RBL, so that
http://www.spammer.com/spam/web/bug/ becomes spam.web.bug.x.www.spammer.com
for a reverse lookup? It could get tricky, though: how would we specify a
randomized intermediate directory?
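Here is a rough sketch of the lookup side, again in Python and again with
made-up names (the urlbl.example.org zone and the ".x." separator are just
placeholders); it doesn't try to solve the randomized-directory problem:

# Minimal sketch of the "synthesized RBL" idea above: fold a spamvertised
# URL into a DNS name and check whether the blacklist zone answers for it.
import socket
from urllib.parse import urlsplit

RBL_ZONE = "urlbl.example.org"  # hypothetical blacklist zone

def rbl_name(url):
    parts = urlsplit(url)
    path_labels = [p for p in parts.path.split("/") if p]
    # http://www.spammer.com/spam/web/bug/ -> spam.web.bug.x.www.spammer.com
    local = ".".join(path_labels + ["x", parts.hostname])
    return f"{local}.{RBL_ZONE}"

def is_listed(url):
    try:
        socket.gethostbyname(rbl_name(url))  # any A record means "listed"
        return True
    except socket.gaierror:
        return False

print(rbl_name("http://www.spammer.com/spam/web/bug/"))
# -> spam.web.bug.x.www.spammer.com.urlbl.example.org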

-----Original Message-----
From: Daniel Rogers [mailto:[EMAIL PROTECTED]] 
Sent: Monday, September 30, 2002 7:18 PM
To: [EMAIL PROTECTED]
Subject: Re: [SAtalk] URL blacklist


On Mon, Sep 30, 2002 at 04:09:48PM -0500, SpamTalk wrote:
> Shouldn't a list such as this be a part of the next release in 
> the same manner as frequent spam phrases?

I'm happy to provide my list, either for just a couple people, or for
inclusion in the distro.

The only problem is that there are a lot of domains in the list, and I'm
having to add more and more every day.  So, it would be good at first, but
there's always another domain to add!

There are also some phone-number and other rules in there, so take whatever
you like.

I'll attach the rules to this email as a zip file so they don't trigger SA. 

Dan.


