This seems strange:

> Here is sa-learn --dump magic:
> This shows that I have more than enough spam and ham
> 0.000          0          3          0  non-token data: bayes db version
> 0.000          0       3754          0  non-token data: nspam
> 0.000          0        220          0  non-token data: nham
> 0.000          0     312279          0  non-token data: ntokens
> 0.000          0 1051829432          0  non-token data: oldest atime
> 0.000          0 1135374012          0  non-token data: newest atime
> 0.000          0          0          0  non-token data: last journal sync atime
> 0.000          0 1135373049          0  non-token data: last expiry atime
> 0.000          0          0          0  non-token data: last expire atime delta
> 0.000          0          0          0  non-token data: last expire reduction count


> [19915] dbg: bayes: DB journal sync: last sync: 1105811470
> [19915] dbg: bayes: corpus size: nspam = 153968, nham = 40588
> [19915] dbg: bayes: score = 4.91619356335349e-10
> [19915] dbg: bayes: DB expiry: tokens in DB: 1984629, Expiry max size: 150000, Oldest atime: 1084312293, Newest atime: 0, Last expire: 1098687948, Current time: 1135280002
> [19915] dbg: bayes: DB journal sync: last sync: 1105811470

As I read that, the bayes db has 3754 spam and 220 ham.
But later in processing it reports 153968 spam and 40588 ham!

This makes me think you have two different bayes databases under two
different users, which would in turn imply two different user_prefs files,
and one of them might not be enabling bayes.
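
One quick way to confirm that is to dump the magic once as the user you
train with and once as the user your scanner runs as, and compare the
nspam/nham counts.  Something like the following should do it (the
"spamd-user" name is just a placeholder for whatever account spamd or
your MDA runs under, and the prefs path assumes the default per-user
location):

    # dump the db for the user you run sa-learn as
    sa-learn --dump magic

    # dump the db for the user the scanner runs as
    sudo -u spamd-user sa-learn --dump magic

    # check each user's prefs for the bayes settings
    grep -E 'use_bayes|bayes_path' ~/.spamassassin/user_prefs

If the two dumps show different counts, you are training one database and
scanning with another; either train as the same user the scanner runs as,
or point both users at one database with bayes_path.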

        Loren
