On 02/10/14 10:17, Axb wrote:
> have you tried "-L forget" before "-L spam" ?
>
> sa-learn --dump magic before and after learning should show a
> difference...
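
For anyone following along, my reading of that suggestion is roughly the sequence below (the message path is just a placeholder for whatever copy of the mis-learned message you have on disk):

  sa-learn --dump magic | grep -E 'nspam|nham'   # note the counts first
  sa-learn -L --forget /path/to/message.eml      # drop any earlier learning of this message
  sa-learn -L --spam   /path/to/message.eml      # re-learn it as spam
  sa-learn --dump magic | grep -E 'nspam|nham'   # nspam should tick up (only roughly - on a
                                                 # busy site-wide DB these counters move constantly)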
I didn't do a "forget" before - I'll remember that, thanks.

As far as "before, after" goes for the dump - not an option. We're receiving 6-12 messages per second, so "--dump magic" is *always* different :-) However, it's been 7 hours since I sent my first email and the same message now scores BAYES_20 - so it is "learning" something; it just took longer than I was used to, I guess.

We use site-wide SA and don't really hand-feed the Bayes DB (too hard for our users: Exchange backends, SA frontend), so there are well over twice as many nspam entries as nham - could that cause a problem?

sa-learn --dump magic

0.000          0          3          0  non-token data: bayes db version
0.000          0    3436572          0  non-token data: nspam
0.000          0    1475976          0  non-token data: nham
0.000          0          0          0  non-token data: ntokens
0.000          0          0          0  non-token data: oldest atime
0.000          0          0          0  non-token data: newest atime
0.000          0          0          0  non-token data: last journal sync atime
0.000          0          0          0  non-token data: last expiry atime
0.000          0          0          0  non-token data: last expire atime delta
0.000          0          0          0  non-token data: last expire reduction count

--
Cheers

Jason Haar
Corporate Information Security Manager, Trimble Navigation Ltd.
Phone: +1 408 481 8171
PGP Fingerprint: 7A2E 0407 C9A6 CAF6 2B9F 8422 C063 5EBB FE1D 66D1