Guy Hulbert wrote:
> However:
>
> ~/tmp/dlog-0.9.8b$ wc *.[lyh12]
>  4598  11903 117593 total
>
> wc qmailmrtg7-4.2/*.*
>  1077  2769 28913 total
>
> wc tinydns-rrd-0.50/tinydns-rrd
>  345  1058 10014 tinydns-rrd-0.50/tinydns-rrd
>
> Obviously the line counts are not a completely fair comparison, but it
> seems that the tools are progressively less accessible (perl, C, lex
> +yacc) and the code seems to become *more* complex (though each tool
> covers more different logs).
That's not really fair, as dlog consists of analyzers for many different tools. The 
data-collecting tools can also be used on their own, for instance to feed mrtg directly:

$ wc dlogtiny.*
   197     498    4216 total

$ wc dlogqmail.*
    195     488    3889 total
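For reference, mrtg can poll an external program named in backticks on a `Target` line; the program must print four lines (in-value, out-value, uptime, target name). A minimal wrapper sketch, where the `deliveries: 1234` line stands in for whatever a collector like dlogqmail would emit (the collector invocation here is hypothetical):

```shell
#!/bin/sh
# mrtg expects exactly four lines on stdout:
#   1: "incoming" value
#   2: "outgoing" value
#   3: uptime string (may be empty)
#   4: target name (may be empty)
# Fake collector output for illustration; a real wrapper would run
# the dlog collector here instead of echo.
count=$(echo "deliveries: 1234" | awk '{print $2}')
echo "$count"     # in
echo "$count"     # out
echo ""           # uptime (unused)
echo "mailhost"   # name
```

In mrtg.cfg the wrapper would then be referenced roughly as ``Target[mail]: `/usr/local/bin/dlog2mrtg` `` (path hypothetical).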

But the prize winner will still be the dodlog.pl script that creates the graphs:

$ wc dodlog.*
   1804    5324   64631 total

(Roughly a third of that is definitions for the RRDtool databases, which could 
perhaps live in a config file instead.)
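To give an idea of the kind of definition involved: an RRDtool database is created with a step interval, data sources, and round-robin archives. A sketch with invented names and sizes (the actual DS/RRA choices in dodlog.pl may differ):

```shell
# Hypothetical example of one RRDtool database definition:
# a 5-minute-step counter of qmail deliveries, with one
# high-resolution archive and one 1-hour-averaged archive.
rrdtool create qmail.rrd --step 300 \
    DS:deliveries:COUNTER:600:0:U \
    RRA:AVERAGE:0.5:1:2016 \
    RRA:AVERAGE:0.5:12:1488
```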

- but again, that script handles all the different types of log files, and it 
does not have to run on the same host as the data-collecting tools. Many 
large-scale installations produce gigabytes of log files, and the small binary 
that actually searches through a logfile puts minimal strain on the production 
hosts (though of course it has to traverse the data once). You can then do 
whatever you like with the output (produce graphs, or pass the info to munin) 
on another host.
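The split can be simulated locally: the "production" step scans the big logfile once and writes only a tiny summary, and only that summary would be shipped to the graphing host. All file names here are invented for the sketch:

```shell
#!/bin/sh
# Simulate the two-host split in a temp directory.
dir=$(mktemp -d)
printf 'ok 1\nfail 2\nok 3\n' > "$dir/big.log"   # stand-in for a huge logfile
# One pass over the log on the "production" host; output is a few bytes.
grep -c '^ok' "$dir/big.log" > "$dir/summary"
# In real life the summary would now be shipped off-host, e.g.:
#   scp "$dir/summary" grapher:/var/spool/dlog/
# and the graphing host would read only the summary:
ok_count=$(cat "$dir/summary")
```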


> Is it really necessary to have a different program for each log, or is it
> just easier to do it that way?
Many of the tools produce very similar output (I think axfrdns and tinydns are almost identical), so it's difficult to make one pattern to rule them all.
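To illustrate why "almost identical" is not good enough: if two tools put the same field in different positions, a single shared pattern silently picks the wrong field for one of them. The line shapes below are invented, not real djbdns output:

```shell
#!/bin/sh
# Two hypothetical log lines that carry the same information
# (a domain name) in different field positions.
line_a='tinydns: query a example.com'
line_b='axfrdns: axfr example.com ok'
# Each tool therefore needs its own extraction pattern:
name_a=$(echo "$line_a" | awk '{print $4}')
name_b=$(echo "$line_b" | awk '{print $3}')
```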


-Skaarup
