> > try a couple of dozen connections to the same remote host 
> at the same time.
> > (This is an issue in itself!)
> 
> Why is this an issue?  If the remote host can handle 100 inbound
> connections, you should be able to open 100 connections to 
> them, inject
> your messages, and close the connections.  Everyone's happy.
> 
> If the remote host can't handle that many, it shouldn't accept that
> many.  You'll then get connections past X deferred, and qmail 
> will back
> off.
> 
        It's an issue because, while in an ideal world this would be fine,
we don't live in an ideal world, and not every SMTP server out there drops
connections smoothly. Instead, they hang, or accept connections they can't
handle, leading to reduced throughput.

        If I'm sending a few thousand mails, chances are it'd be possible to
maintain full throughput without hitting the same host more than once
concurrently. Sure, if there's nothing else in the queue, then you may as
well use multiple threads per MX, but what do you lose by scheduling other
hosts first?
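
        To make the idea concrete, here's a rough sketch of the scheduling
policy I mean -- prefer destinations with no active connection, and only
fall back to a host that's already being delivered to when nothing else is
queued. This is purely illustrative (the names and the per-host cap are my
own, not anything qmail actually does):

```python
from collections import defaultdict

def pick_next(queue, active_by_host, per_host_cap=1):
    """Pick the next queued message, preferring hosts with no active
    connections; only consider a host that already has connections open
    when it's still under per_host_cap. Illustrative only."""
    best = None
    for msg in queue:
        load = active_by_host[msg["host"]]
        if load == 0:
            return msg                    # idle host: take it immediately
        if load < per_host_cap and best is None:
            best = msg                    # acceptable fallback
    return best

# Three messages, two for the same host; one connection already open:
queue = [{"id": 1, "host": "a.example"},
         {"id": 2, "host": "a.example"},
         {"id": 3, "host": "b.example"}]
active = defaultdict(int)
active["a.example"] = 1
print(pick_next(queue, active)["id"])     # picks message 3 (idle host)
```

With a cap of 1 this never opens a second concurrent connection to a host
while another destination is waiting, which is the behaviour I'm arguing
for above.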

        I've also noticed that if qmail tries to deliver (for example) 50
messages to one host concurrently, perhaps 2 will get through. The rest will
be retried, but unfortunately they tend to get retried at much the same
time. Again, 2 messages get through, and the process repeats. This simply
isn't efficient.
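
        The reason the retries bunch up is that all 50 deferrals get the
same retry delay, so they come back in lockstep. Adding random jitter to
the backoff would spread them out; a minimal sketch of the idea (the base
and cap values are made up for illustration, this is not qmail's actual
retry schedule):

```python
import random

def retry_delay(attempt, base=10.0, cap=3600.0):
    """Exponential backoff with full jitter: deliveries that failed at
    the same moment drift apart instead of retrying in lockstep.
    Illustrative only -- not qmail's schedule."""
    return random.uniform(0, min(cap, base * (2 ** attempt)))

# 50 messages deferred at the same moment no longer retry together:
delays = sorted(retry_delay(3) for _ in range(50))
print(round(delays[-1] - delays[0], 1), "seconds of spread")
```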

        I think qmail is great, don't get me wrong. I just think there is
room for improvement.

                Richard

P.S. People here seem to be a little over-sensitive about this issue!
