Hi All: I'm using RRD to collect data about a DNS server, using a C program that gathers the data and writes it to the RRD file every 30 seconds.
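For context, the setup described above corresponds roughly to the following rrdtool commands (the file name, DS name, and limits are my assumptions, not from the post). The heartbeat field of the DS is what decides how long a gap between updates may be before rrdtool stores UNKNOWN instead of a value:

```shell
# Hypothetical configuration sketch -- names and limits are illustrative.
# --step 30      : one sample expected every 30 seconds
# heartbeat 60   : if no update arrives within 60 s, the slot becomes
#                  UNKNOWN rather than being filled from stale data
rrdtool create dns-stats.rrd --step 30 \
    DS:queries:GAUGE:60:0:U \
    RRA:AVERAGE:0.5:1:2880

# What the C collector effectively does every 30 seconds:
rrdtool update dns-stats.rrd N:200
```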
When I kill the program, the graph shows the intervals with no data as very high values. For example, while my samples are normally around 200, the periods with no data show values near 6M or more. How can I avoid this behaviour? An example can be seen here: http://nicolette.nic.cl/~secastro/last12h.png

Best Regards
--
Sebastian E. Castro Avila  [EMAIL PROTECTED]
DNS Administrator, NIC Chile
Agustinas 1357 Piso 4, Santiago, Chile  Cod. Postal 6500587
Phone: +56-2-9407705  Fax: +56-2-9407701
