Hi,

Same problem here with a 1.3G bayes_seen file.

No CPU load is linked to this here, but such a big file is never good...

Can someone help us deal with this? As far as I remember, this problem has
been discussed many times here, but I have never seen a trick for it.
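
One trick that keeps coming up on the list: bayes_seen is only a cache of
message-ID hashes for mail that has already been learned, and as far as I
know sa-learn --force-expire never touches it, so it just keeps growing.
It is generally reported as safe to remove it while SpamAssassin is
stopped; it gets recreated on the next learn. A minimal sketch, assuming
the default Berkeley DB (DBM) backend and a per-user bayes path of
~/.spamassassin -- adjust the path and the service names for your setup:

# stop whatever feeds SpamAssassin (spamd, amavisd, a milter, ...)
service spamassassin stop

# keep a copy instead of deleting outright, just in case
cd ~/.spamassassin
mv bayes_seen bayes_seen.old

service spamassassin start
# bayes_seen is recreated automatically; remove bayes_seen.old once
# learning still works as expected

The only cost should be that mail learned before the removal could be
learned a second time.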

-----Original Message-----
From: Richard Smits [mailto:[EMAIL PROTECTED]
Sent: Tuesday, June 12, 2007 09:30
To: users@spamassassin.apache.org
Subject: How to decrease the bayes database size

Hello,

We really need some help here. It has come to our attention that our bayes
database is 2.4 GB in size. It is really slowing down our servers, and they
have a high CPU load.

We have tried the trick with sa-learn --force-expire, and it deletes a lot
of entries, but the file is not getting any smaller.

79K  Jun 12 09:26 bayes_journal
20M  Jun 12 09:26 bayes_toks
2.5G Jun 12 09:26 bayes_seen*

Does anyone have some tricks to help us out?

Greetings... Richard Smits

----
0.000          0          3          0  non-token data: bayes db version
0.000          0   14201082          0  non-token data: nspam
0.000          0    7760360          0  non-token data: nham
0.000          0     916962          0  non-token data: ntokens
0.000          0 1181559955          0  non-token data: oldest atime
0.000          0 1181633069          0  non-token data: newest atime
0.000          0 1181633115          0  non-token data: last journal sync atime
0.000          0 1181604237          0  non-token data: last expiry atime
0.000          0      43200          0  non-token data: last expire atime delta
0.000          0     360013          0  non-token data: last expire reduction count

----------------------
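
As for --force-expire not shrinking anything: expiry only removes tokens
from bayes_toks, and as far as I know a Berkeley DB file never shrinks on
disk when records are deleted -- the freed pages are reused internally but
not given back to the filesystem. With roughly 14 million spam and 7.7
million ham learned (the nspam/nham lines above), bayes_seen holds an
entry for every one of those messages. The usual way to reclaim the space
is to dump the database and rebuild it. A rough sketch, assuming
SpamAssassin 3.x with the DBM backend, run as the user that owns the
bayes files (the backup path here is just an example):

# stop spamd/amavisd so nothing writes to the database meanwhile
service spamassassin stop

sa-learn --backup > /var/tmp/bayes-backup.txt   # dump the bayes data to a flat text file
sa-learn --clear                                # remove the old, bloated database files
sa-learn --restore /var/tmp/bayes-backup.txt    # rebuild compact files from the dump

service spamassassin start

The restore can take a while with a database this size, but the rebuilt
files should only contain live data.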

