It really has to do with the nature of the data you are monitoring. I'm monitoring the response time of some COBOL programs that run on a mainframe and are called from an internet application. For every call made to a program, I can compute the response time: I know exactly how many calls are made, when they are made, and how many milliseconds each lasts. I want to plot a graph that shows how quickly the programs run while the application is online. A good graph is one that shows the maximum response time over the last 10 seconds, so that I can see if any of the programs gets stuck or is unusually slow. To do this, I collect data for 10 seconds, keep the maximum value, and feed it to an rrdtool GAUGE. The value I store in the database IS the maximum value within those 10 seconds, since I have knowledge of all the data-values at any point in time.

I was expecting to be able to feed the computed value to rrdtool at any point within the step, but I was getting incorrect results, which led to my first email. Now I feed the values step-aligned, and everything is fine.
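In case it helps anyone else on the list, here is a minimal sketch of what I mean by "feed the values step-aligned", written in Python. The file name (response.rrd), the DS name (maxresp), and the collect_response_times() helper are just placeholders for my actual setup:

    #!/usr/bin/env python
    # Sketch: collect response times for one 10-second step, keep the
    # maximum, and hand it to rrdtool at a step-aligned timestamp.
    import subprocess
    import time

    STEP = 10  # must match the --step the RRD was created with

    # One-time setup (for reference):
    #   rrdtool create response.rrd --step 10 \
    #       DS:maxresp:GAUGE:20:0:U \
    #       RRA:MAX:0.5:1:8640

    def feed_window(samples, window_end):
        """Store the maximum response time of one window, step-aligned."""
        # Snap the timestamp back to the step boundary so the value
        # lands cleanly in one bin instead of being re-binned.
        aligned = (window_end // STEP) * STEP
        subprocess.check_call(
            ["rrdtool", "update", "response.rrd",
             "%d:%d" % (aligned, max(samples))])

    # Main loop; collect_response_times() stands in for whatever
    # instrumentation records the per-call milliseconds.
    # while True:
    #     samples = collect_response_times(seconds=STEP)
    #     if samples:
    #         feed_window(samples, int(time.time()))

Since I already compute the maximum myself before each update, the GAUGE just has to store it unchanged, which only works reliably when the update hits the step boundary.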
Don't you think my case could be better handled by the tool?

Thank you for your answer,
Cristian

Tobias Oetiker wrote:
> Cristian,
>
> the reason for MIN and MAX only recording the 're-binned' values is
> that anything else is pure randomness, assuming your queries are not
> 100% equidistant ... this is especially so with counters. For GAUGE
> datasources, min and max are illusions anyway, as you are reading the
> data at some point in time and have no knowledge of the data-values
> at any other point ...
>
> so your quest should not be to try to change the way min/max behave,
> but rather to understand why it works the way it works ...
>
> cheers
> tobi
