[EMAIL PROTECTED] wrote:
> My concern with killing all of the connections is that I have some
> internal processes that (I think) will occasionally corrupt log and data
> files when this happens.

You may want to reconsider your design, then.  If you are trying to aggregate
multiple log streams into a single file using any form of locking, you are
almost inevitably going to be fighting corruption.  You could use something like
multilog, but that only allows you to post-process the log entries (when the log
file is rolled), which may not be timely enough for your needs (though of course
you could make the logfile size as small as you needed it to be for 
granularity).

One thing to consider would be a log process outside of qpsmtpd that you can
freely write to, then have your processing happen in a controlled manner in a
third process.  The folks I work with have released the implementation we use
internally:

https://labs.omniti.com/trac/jlog

I'm not necessarily recommending that tool per se (the documentation is almost
non-existent at the moment); I mention it as an example of the style of
programming you probably need to pursue:

        qpsmtpd => logger <= log processor

By decoupling qpsmtpd from the log processing, you can do whatever you want to
qpsmtpd and the log processor still gets clean log entries to analyze.

John
