Theo wrote:

> In 2.5x, if you're going to do manual expires, change the expiry_count
> value to something really large.
>
> In 2.6x, just do 'bayes_auto_expire 0'.  ;)
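For reference, in local.cf those two approaches would look roughly like
this -- just a sketch using the directive names mentioned in this
thread, so double-check them against the documentation for your version:

    # 2.5x: push the expiry window out so far that automatic expiry
    # effectively never triggers (the value here is illustrative)
    bayes_expiry_scan_count   500000

    # 2.6x: turn automatic expiry off entirely
    bayes_auto_expire         0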
Okay, so I just set bayes_expiry_scan_count to 500000 (as Dallas
suggested in his previous email, and as you do here).  No expirations
are done automatically, but then according to the sa-learn man page:

    SpamAssassin runs through every token in the database.  If that
    token has not been used during the scanning of the last
    `bayes_expiry_scan_count' messages, it is marked for deletion.

So if I have 500,000 for bayes_expiry_scan_count, a token will only be
marked for deletion very rarely, since it's unlikely that most of the
tokens will *not* have been seen in the last half-million emails.

    Next, if that operation would bring the number of tokens below the
    `bayes_expiry_min_db_size' threshold, it removes tokens from the
    for-deletion list until the resulting database would contain
    `bayes_expiry_min_db_size' token entries.  It then removes the
    listed tokens and updates the 'last expiry' setting.

So that explains what happens if the DB gets too *small*.  But what
governs how *big* the database can get?  Would it just be big enough to
contain every token seen in the last 500,000 messages?  Would that be a
problem for performance or DB size?

thanks!!
johnS
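For keeping an eye on how large the database actually grows once
automatic expiry is off, something along these lines should work -- a
sketch, assuming your sa-learn build supports these options (check
'sa-learn --help' first):

    # show bayes DB statistics, including the current token count
    sa-learn --dump magic

    # run an expiry pass by hand, e.g. from a nightly cron job
    sa-learn --force-expire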