On 02/07/2014 03:04 AM, Kevin A. McGrail wrote:
On 2/6/2014 8:32 PM, Dave Warren wrote:
On 2014-02-06 17:17, John Hardin wrote:
On Thu, 6 Feb 2014, Kevin A. McGrail wrote:

I've discussed it with Alex a bit, but one of my next ideas for the
Rules QA process is the following:

- we measure and report on metrics for the rules that are promoted,
such as rank (existing), computational expense, and time spent per rule.

I assume meta rules would combine the expense of their components?

Sounds interesting!
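
For reference, a meta rule in SpamAssassin config syntax combines
zero-scored sub-rules (the leading double underscore keeps a sub-rule
from scoring on its own); the rule names below are made up purely for
illustration:

    body     __EX_SUBRULE_A   /example body phrase/i
    header   __EX_SUBRULE_B   Subject =~ /example subject/i
    meta     EX_META_RULE     (__EX_SUBRULE_A && __EX_SUBRULE_B)
    score    EX_META_RULE     1.0

so "combining the expense" would presumably mean charging EX_META_RULE
for the evaluation time of both of its sub-rules.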


How about if one or more components were called by more than one
meta-rule? It's perhaps not entirely fair to divide the expense evenly,
since that might imply that removing one meta-rule would kill off that
CPU usage.
Without triple-checking the code, my 99.9% belief is that rules are
cached: calling them multiple times does not trigger a re-check.
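
A sketch of the shared-component case under that assumption (again with
made-up rule names): the shared sub-rule's pattern should run once per
message, and both meta rules read the cached result.

    # evaluated once per message; the hit/miss result is cached
    body   __EX_SHARED    /shared pattern/
    body   __EX_ONLY_A    /pattern a/
    body   __EX_ONLY_B    /pattern b/
    # both meta rules reuse the cached __EX_SHARED result
    meta   EX_META_ONE    (__EX_SHARED && __EX_ONLY_A)
    meta   EX_META_TWO    (__EX_SHARED && __EX_ONLY_B)

So removing EX_META_ONE alone would not free __EX_SHARED's CPU time
while EX_META_TWO still references it.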

Duplicate rules only get loaded once, but they still cost time/CPU
cycles to parse, so the fewer duplicates we have, the faster we start a
spamd or load rules when running spamassassin.

See the beginning of the output when running "spamassassin --lint -D rules".
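
For example, since the debug output goes to stderr, the beginning of
the rule-loading output can be captured with:

    spamassassin --lint -D rules 2>&1 | head -40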

