>> email builder wrote:
>>>   As a result of this, however, we are currently burdened with an
>>> 8GB (yes, you read that right) bayes database (more than 20K users
>>> having mail delivered).
>> 
>> Consider using bayes_expiry_max_db_size in conjunction with
>> bayes_auto_expire
> 
> "Using"?  So you are saying you use non-sitewide bayes but you limit
> your max DB size to something much smaller than the default?  Care to
> share your settings?

No, I use sitewide bayes.

> We left these at their defaults (not unintentionally).  If we have
> 20K users, the default max of 150,000 tokens, at roughly 8MB per user,
> comes out to 160GB.  We have the disk space, but we're just not sure
> we have the tuning it would take to handle a DB of that size.  What I
> am looking for is tuning help, or other ideas on how to achieve some
> reasonable level of bayes personalization without drowning our DB
> resources.

For optimum performance you probably want the bayes database to fit into RAM, 
along with all of your SpamAssassin objects and everything else running on the 
server.

You might consider buying a dedicated Bayes DB server with 4 GB of RAM, and 
cutting bayes_expiry_max_db_size in half.  That should do it.

If the DB fits into RAM, the SQL engine should be able to make transactional 
changes in RAM and lazily spool them to the disk without forcing other 
transactions to wait.
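If the backend happens to be MySQL with InnoDB tables (an assumption on my 
part -- substitute the equivalents for your engine), the knobs that control 
that behaviour look roughly like this in my.cnf (values illustrative only):

  [mysqld]
  # large enough to hold the bayes tables and their indexes in memory
  innodb_buffer_pool_size        = 2G
  # flush the transaction log once per second instead of on every commit;
  # trades roughly a second of durability for far less fsync contention
  innodb_flush_log_at_trx_commit = 2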

-- 
Matthew.van.Eerde (at) hbinc.com               805.964.4554 x902
Hispanic Business Inc./HireDiversity.com       Software Engineer
