Hi Jason, Q-S List,

> > But you are saying that these orphaned |-perl5.8.0 processes are normal?
> No - I said the qmail-queue.log stuff you sent through all looked normal - I
> didn't see anything that was causing the hangs you are seeing.

OK understood.


> why is it calling reformime after it has failed?

It doesn't. If you are referring to the "no sender or recip found" messages,
Q-S exits after that - they are not causing the hangs...


> > Linux Slackware 8.1 SMP Linux gilliam 2.4.21 #1 SMP Sat Jun 21 19:21:56
> > GMT 2003 i686 unknown
> That's a really new kernel... Let's go back to basics - when did this
> problem start? Has this ever worked 100% for you? Or did these problems
> occur after you upgraded to 2.4.21?

Well, the server was installed with Slack 8.1, which is 2.4.18 - everything
was working fine under that until I found the failed disk in the array that
day. That weekend I took the disk out of the array and scanned it with
badblocks, found nothing, so put it back in the array. That night I also
upgraded to 2.4.20 and installed the latest Sophie 3, SA and (maybe)
Qmail-scanner from 1.15 to 1.16 - after that 2.4.20 upgrade the box would
actually spiral and die when things got resource-heavy. I upgraded to
2.4.21-rc8 and things became more stable again (I found something about a
deadlock with SMP and eepro100 NICs in the changelog), and then upgraded to
the final 2.4.21 two nights ago out of desperation.


> > OK, but orphaned processes off init ?  :\
> Well that's what we're trying to discover :-)

It is 5am and the server is quiet.

I just dumped 20 copies of a normal 75k email at myself and my load went to
2.2 with no orphaned processes; then I dumped 20 copies of a mail with no
recipient and no sender in it, and the load went to 2.3 with a load of
orphaned processes. All the processes "lasted" the same amount of time in
both cases.

Is this how it should have behaved? I am still not clear on whether they
should ever get orphaned like that.
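
For reference, this is roughly how I am spotting the orphans - anything
whose parent has fallen back to init (PID 1) - assuming the stock procps
ps here:

  # list processes that have been reparented to init
  ps -eo pid,ppid,args | awk '$2 == 1'

  # or keep an eye on the process tree while the test mails go through
  watch "ps axf | grep perl5.8.0"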


> No - I mean you said you had a disk problem - did you or didn't you?

I did. I found an (F) next to one disk in mdstat and a few seek errors in
the logs, so I took the disk out of the array, ran badblocks on it over the
w/e, and then re-added it.
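
Roughly the sequence, from memory (device names here are illustrative, not
what I actually typed):

  cat /proc/mdstat                    # one disk showing (F)
  raidhotremove /dev/md0 /dev/hdc1    # pull it out of the array
  badblocks -sv /dev/hdc1             # read-only scan - found nothing
  raidhotadd /dev/md0 /dev/hdc1       # re-add and let it resync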


> > Is it just totally normal to have these orphaned processes and I have
> > been barking up a tree for no reason?
> Oh no - you do have a problem alright...
> OK, so it looks like reformime is hanging on you there.

OK good.


> Your logs say:
> 22/06/2003 08:53:26:15752: d_m: starting
> usr/local/bin/reformime  -x/var/spool/qmailscan/gilliam105627200642615752/
> </var/spool/qmailscan/working/new/gilliam105627200642615752
> Can you check that's correct? I mean it says:
> "starting usr/local/bin/reformime"
> instead of:
> "starting /usr/local/bin/reformime"
> why is that?

I just checked that old log now:

22/06/2003 08:53:26:15752: d_m: starting
usr/local/bin/reformime  -x/var/spool/qmailscan/gilliam105627200642615752/
</var/spool/qmailscan/working/new/gilliam105627200642615752
[1056272007.23375]
22/06/2003 08:53:26:15752: d_m: finished
usr/local/bin/reformime  -x/var/spool/qmailscan/gilliam105627200642615752/
[1056272007.24763]

and I have checked the current log too - that looks OK:

23/06/2003 03:51:13:10984: d_m: starting
usr/local/bin/reformime  -x/var/spool/qmailscan/gilliam105634027342610984/
</var/spool/qmailscan/working/new/gilliam105634027342610984
[1056340279.10731]
23/06/2003 03:51:13:10984: d_m: finished
usr/local/bin/reformime  -x/var/spool/qmailscan/gilliam105634027342610984/
[1056340279.12193]
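
(Going by the bracketed epoch stamps, both of those reformime runs finished
in about 15ms - e.g. 1056340279.12193 - 1056340279.10731 = ~0.015s - so at
least these particular invocations are not the ones hanging.)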


> Secondly, have you tried recompiling maildrop on 2.4.21 to see if that's
> the problem?

Yes - I am running maildrop 1.5.3 now, freshly recompiled from source, and
still get the same orphaned processes.
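
The rebuild was just the usual steps, from memory:

  tar xzf maildrop-1.5.3.tar.gz
  cd maildrop-1.5.3
  ./configure
  make && make install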


> I must say I'm a bit lost. You apparently have a working system that just
> turns to custard for no apparent reason - hence me continually asking you
> about that disk error you reported in your very first message... This
> looks like a hardware problem

OK - is there any way I can triple-check this :\ ?


> One thing you've never said is if this has ever worked OK for you - or is
> this the first time you've had a go at AV scanning on your Qmail servers?

Yeah, this worked OK for a while before any disk problems showed up (why do
I have a feeling that answers my own question?)... it was the night of
popping the disk back in and upgrading everything that got me into this.


> In your original msg you said these were RAID-1 IDE disks - I wonder just
> how well they'll handle the load. As the Q-S docs say, running AV software
> over each and every mail message adds a HUGE load to a box. Another AV
> SMTP product I know of says that it takes 10x the hardware to deal with
> the same amount of e-mail once you turn AV scanning on - so if your boxes
> were >10% busy before you enabled Q-S - you may indeed find the box isn't
> up to the load.
> RAID-1 means write every file TWICE - once to each disk. Q-S does lots of
> I/O while scanning, etc...

It is RAID-1 on ATA-100, dual PIII 1.4s, and 2 GB of RAM. Would vmstat or
something similar show how busy the I/O is and how resource-intensive it is?
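
e.g. something like this while the queue is busy (this is my reading of the
2.4-era vmstat columns - correct me if I have them wrong):

  vmstat 5
  # procs 'b'  - processes blocked, usually waiting on I/O
  # io 'bi/bo' - blocks in/out per second; RAID-1 doubles every write
  # cpu 'sy'   - high system time with low 'us' can also point at I/O churn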


> I run SCSI-based servers that handle a similar load - and I MADE SURE I
> put the /var/ partition onto a non-RAIDed disk for these very reasons. Oh
> yeah - and we have TWO of those boxes - not one...

Two boxes of that spec? This still does the orphaned processes even if I
take off Sophie and SpamAssassin - shouldn't that cut the load a lot, with
only stock Qmail-scanner on it? It is only these no-recipient, no-sender
emails that cause the orphaned processes, AFAIK.


> In the end - I don't know. I don't have enough information about your
> environment and history to know whether this is a new problem, or a
> problem that has occurred since you started using Q-S...

If there is any other info you need, please do just ask - apologies if I
have missed anything (it's 5.08am :|).

Thanks for your help with this, apologies to those not interested.



Regards
Pete



