On Sun, 31 Jul 2016, Robert Schetterer wrote:
> Greylisting was invented as an idea against bots. It's based on the idea
> that bots "fire and forget" when they see a temp error and don't come back.
> But that's historic; bots have been recoded, and better anti-bot techniques
> have been invented. The only problem now is people still believe in
> historic stuff.
This argument ignores two important facts.
First, even if 98% of bots and viruses (and that number is pure
conjecture on my part) are now smart enough to retry, that doesn't
change the fact that greylisting is just about the lowest-cost way of
stopping the ones that aren't smart enough (or aren't designed to
retry, because they want to push the largest amount of junk at the
lowest-hanging fruit).
Second, the ability of a bot, virus, server or any other spam source
to retry delivery after a temp failure is not the only "weakness"
greylisting takes advantage of. A spam source might not get past my
greylist for any number of reasons, including the classic case of poor
coding/design, but also:
- It is detected and blocked (or taken offline) by the source network
before its greylist period is up
- It makes use of a compromised account, and that account is disabled
  or secured before its greylist period is up
- It is part of a distributed botnet, so subsequent attempts come from
a different IP/network
- It sends a high volume of spam, so it doesn't come back around to
retry again until after its entry has been removed, requiring
a whole new greylisting period
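All of those cases reduce to the same core mechanism: track each
(client IP, sender, recipient) triplet, tempfail the first attempt, and
only accept once a retry arrives after the delay. Here is a minimal
in-memory sketch of that logic; the function name, delay values, and
dictionary store are illustrative assumptions, not how any particular
implementation (postgrey, etc.) actually does it:

```python
import time

# Illustrative values only -- real deployments tune these.
GREYLIST_DELAY = 300        # seconds a new triplet must wait before acceptance
GREYLIST_LIFETIME = 86400   # seconds an unconfirmed entry lives before expiring

_seen = {}  # (client_ip, sender, recipient) -> first-seen timestamp

def check_greylist(client_ip, sender, recipient, now=None):
    """Return 'defer' (i.e. SMTP 450 tempfail) or 'accept' for one attempt."""
    now = time.time() if now is None else now
    key = (client_ip, sender, recipient)
    first_seen = _seen.get(key)
    if first_seen is None or now - first_seen > GREYLIST_LIFETIME:
        # New (or long-expired) triplet: record it and tempfail.
        _seen[key] = now
        return "defer"
    if now - first_seen < GREYLIST_DELAY:
        # Retrying too soon: keep deferring.
        return "defer"
    return "accept"
```

Note how the botnet case above falls out for free: a retry from a
different IP is a different triplet, so it starts a fresh greylist
period.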
Others could probably add to that list, but that's just off the top of
my head. But, even if a spam source retries and successfully makes it
past the greylisting, the greylisting still provides potential
benefits, like:
- While it was waiting to retry, its IP has been added to BLs, which
my other filters will score appropriately
- While it was waiting to retry, the phishing URL in it has been
reported and taken down (or the URL shortener link it used has been
removed)
- While it was waiting to retry, the virus it carries has been
identified and pushed out to my virus definitions
- While it was waiting to retry, its registered domain has been
removed
- While it was waiting to retry, others who received the spam have
reported it to services like Razor and DCC, which other filters will
act on
- If it has to keep retrying deliveries to my server, it's spending
  resources (however minimal) that could otherwise be used to send its
  junk to others
Again, I'm sure others could add more based on their experiences.
I'm not saying greylisting is without problems, that it just works out
of the box (initial and ongoing configuration is critical), or that
everyone should be using it, but there's a lot more going on here than
just outwitting poorly written bots.
--
Public key #7BBC68D9 at | Shane Williams
http://pgp.mit.edu/ | System Admin - UT CompSci
=----------------------------------+-------------------------------
All syllogisms contain three lines | sha...@shanew.net
Therefore this is not a syllogism | www.ischool.utexas.edu/~shanew