> -----Original Message-----
> From: Matt Kettler

> from SA 2.55's rules/STATISTICS-set1.txt
> 
> OVERALL%   SPAM%     HAM%     S/O    RANK   SCORE  NAME
>   27.427  51.2148   0.1658    0.997   1.00    2.06  RAZOR2_CHECK
>   13.201  24.7144   0.0074    1.000   0.97    4.40  PYZOR_CHECK
>    6.254  11.6769   0.0384    0.997   0.94    3.02  DCC_CHECK


Have these statistics held fairly steady over time, or do they fluctuate?  It
seems the numbers are a point-in-time snapshot that says more about the spam
being reported than about the effectiveness of each specific check.  While
the corpus of messages used to generate these statistics may be the same for
all three tests, the corpus of checksums behind each check differs, since
the pieces of, and the number of, reported spam will differ between
services.  Is this true?  I hope I articulated my thought correctly.  If my
reasoning is right, then it would be beneficial to run all three tests
rather than just one or two.  Thoughts?
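As a rough back-of-the-envelope sketch (my own numbers, not from the SA
docs): if the three checks were statistically independent -- a strong and
probably wrong assumption, since all three are fed by reports of the same
widely-circulated spam -- the combined catch rate from the SPAM% column
above would work out like this:

```python
# Per-check spam hit rates from rules/STATISTICS-set1.txt (SA 2.55)
spam_rates = {
    "RAZOR2_CHECK": 0.512148,
    "PYZOR_CHECK":  0.247144,
    "DCC_CHECK":    0.116769,
}

# Under an independence assumption, the chance ALL checks miss a given
# spam is the product of the individual miss rates.
miss = 1.0
for rate in spam_rates.values():
    miss *= 1.0 - rate

combined = 1.0 - miss
print(f"combined catch rate (independence assumption): {combined:.1%}")
# -> roughly 67.6%, versus 51.2% for Razor2 alone
```

The real overlap between the services is surely large, so the true
combined figure would sit somewhere between 51.2% and 67.6% -- but any
gain over the best single check would argue for running all three.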

--Larry




_______________________________________________
Spamassassin-talk mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/spamassassin-talk
