On Thu, 2013-05-30 at 23:11 -0400, Alex wrote:
> It would be great if there was an automated script that included
> checking surbl.org and others right from a shell script, given a
> domain name.
> 
What exactly are you trying to do? By that I mean, do you want to:

1) test an individual domain name against online blacklists?
   Do this with a script that uses dig to query each of the blacklists
   in turn (see the sketch below).

   Obviously this is only suitable for relatively casual use.
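
   Something along these lines would do it (an untested sketch; the
   zone names are just examples, and each list has its own usage
   policy, so check before relying on any of them):

   #!/bin/sh
   # check_domain.sh - test one domain against a few domain blacklists.
   domain="$1"
   [ -n "$domain" ] || { echo "usage: $0 domain" >&2; exit 2; }

   for zone in multi.surbl.org multi.uribl.com dbl.spamhaus.org; do
       # A listed domain resolves to a 127.0.0.x address;
       # NXDOMAIN (empty output from +short) means it's not listed.
       result=$(dig +short "$domain.$zone" A)
       if [ -n "$result" ]; then
           echo "$domain LISTED on $zone ($result)"
       else
           echo "$domain not listed on $zone"
       fi
   done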

2) test all the domains in a file against online blacklists?
   Do this with a wrapper script that calls script (1) on each domain
   in your file, as in the sketch below.

   However, if this is used often, you may exceed the usage limits of
   some blacklists.
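
   For example (a sketch; it assumes script (1) was saved as
   check_domain.sh and that the file holds one domain per line):

   #!/bin/sh
   # check_file.sh - run the single-domain checker over a whole file.
   file="$1"
   [ -f "$file" ] || { echo "usage: $0 domainfile" >&2; exit 2; }

   while read -r domain; do
       # skip blank lines and comments
       case "$domain" in ""|"#"*) continue ;; esac
       ./check_domain.sh "$domain"
       sleep 1   # crude rate limiting, to be polite to the list servers
   done < "$file"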

3) test all the domains in a file against blacklist files you've
   downloaded from the blacklist hosts?

   Use a script that downloads the blacklists with 'curl' or a scripted
   FTP client. Use 'awk' to convert the files into a common format
   containing just the domain names, combine them and eliminate
   duplicates with 'sort', and do a mass comparison with 'comm' to
   produce a sorted list of matches and/or non-matches (see the
   sketch after this item).

   This is a bit more work to write, but it should run fast and cannot
   exceed a blacklist's online usage limits.
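
   A rough sketch of that pipeline (the URL and file format are
   placeholders; substitute the real download locations and adjust the
   awk field for each list's format):

   #!/bin/sh
   # check_offline.sh - compare mydomains.txt against downloaded lists.

   # 1) download a blacklist file (placeholder URL)
   curl -s -o blacklist.txt http://blacklist.example.com/domains.txt

   # 2) keep just the domain names: here the domain is assumed to be
   #    the first field, with '#' starting a comment line
   awk '/^[^#]/ { print $1 }' blacklist.txt > blacklist.raw
   # ...repeat steps 1-2 for the other lists, appending to blacklist.raw

   # 3) combine and de-duplicate; comm needs both inputs sorted alike
   sort -u blacklist.raw > blacklist.sorted
   sort -u mydomains.txt > mydomains.sorted

   # 4) mass comparison
   comm -12 mydomains.sorted blacklist.sorted > matches.txt     # listed
   comm -23 mydomains.sorted blacklist.sorted > nonmatches.txt  # clean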

4) what do you want to happen when a domain matches (or doesn't match)
   a blacklist?


Martin




