Hi,

I am running spamd/spamc 2.61 on a relatively small server (300 email
addresses) which is nonetheless rather busy. I have been using a global
Bayes db so that everyone can take advantage of the learned spam and ham
I feed it. This seems to be working well, although occasionally I notice
bayes.lock files getting stuck in the Bayes directory for a short time.
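For reference, the relevant part of my local.cf looks roughly like this
(the path is just an example, not necessarily where yours would live):

    bayes_path      /var/spool/spamassassin/bayes/bayes
    bayes_file_mode 0666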

My plan was to roll this setup out to a large cluster of 20 mail servers,
all accessing a global Bayes db shared on a NetApp filer. This cluster is
set up and running qmail quite nicely with 100,000 accounts and 2,500,000
emails/day. Is running 150 spamd processes per box (= 3,000 spamd
processes across the cluster) going to lock the Bayes db file so
continuously that it hopelessly breaks, or will the spamd processes share
it nicely (which seems unlikely)? Has anyone done anything like this at
this scale?
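For context, my understanding is that the bayes.lock files I see are
SpamAssassin's NFS-safe hardlink locks: each writer creates a uniquely
named temp file and hardlinks it to bayes.lock, checking the link count to
see whether it won. A rough Python sketch of that technique (illustrative
only, not SpamAssassin's actual Perl code; names and retry counts are made
up):

    import os, socket, time

    def nfs_safe_lock(lockfile, max_retries=30):
        """Take an NFS-safe lock by hard-linking a unique temp file to the
        shared lock name and checking the link count (the classic
        link-count trick, since O_EXCL is not reliable over NFS)."""
        unique = "%s.%s.%d" % (lockfile, socket.gethostname(), os.getpid())
        with open(unique, "w") as f:
            f.write("%d\n" % os.getpid())          # record who holds it
        try:
            for _ in range(max_retries):
                try:
                    os.link(unique, lockfile)      # atomic, even over NFS
                except OSError:
                    pass                           # someone else may hold it
                if os.stat(unique).st_nlink == 2:  # link count 2 => we won
                    return True
                time.sleep(1)                      # back off and retry
            return False
        finally:
            os.unlink(unique)                      # temp name no longer needed

    def nfs_safe_unlock(lockfile):
        os.unlink(lockfile)                        # drop the shared lock

If that is roughly right, every Bayes update from all of those spamd
children would have to win that one lock over NFS, which is exactly what
worries me.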

Andreas

