Hi guys,

I'm kind of dying on this one. I've attached three bash scripts that show
where I stop understanding what is going on. I guess it's quite simple.

I feed the same values using the same CF and RRD specs everywhere, but
rrdtest-days.sh produces a graph that looks very different from the one
the other two scripts produce.
Other than the --step size, I did not change much (as a diff would show).
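
In other words, the create calls differ only in the step, roughly like
this (sketched from memory, not pasted verbatim from the scripts):

    rrdtool create test.rrd --start $START --step 1      ...  # rrdtest-seconds.sh
    rrdtool create test.rrd --start $START --step 60     ...  # rrdtest-minutes.sh
    rrdtool create test.rrd --start $START --step 86400  ...  # rrdtest-days.sh

with identical DS and RRA definitions in all three.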

Why (and how) is RRDtool normalizing the data when all I increase is the
timespan between updates? I'm sure I've missed something.
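
In case it helps, here is a stand-alone sketch of the pattern (file
name, timestamps and values are hypothetical, this is not one of the
attached scripts):

    #!/bin/bash
    # Minimal reproducer sketch: a one-day step, with updates that do
    # not fall on step boundaries.
    STEP=86400              # one day, as (I assume) in rrdtest-days.sh
    START=1000000000        # arbitrary epoch timestamp

    rrdtool create norm-test.rrd --start $START --step $STEP \
        DS:val:GAUGE:$((STEP * 2)):U:U \
        RRA:AVERAGE:0.5:1:10

    # Feed two values, each half a step away from a boundary:
    rrdtool update norm-test.rrd $((START + STEP / 2)):10
    rrdtool update norm-test.rrd $((START + 3 * STEP / 2)):20

    # The first full step comes back as 15, not the raw 10 or 20:
    rrdtool fetch norm-test.rrd AVERAGE \
        --start $START --end $((START + 2 * STEP))

That averaged 15 is the normalization I am asking about.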

Thank you very much in advance! :)

Peter

Attachment: rrdtest-days.sh
Description: application/shellscript

Attachment: rrdtest-minutes.sh
Description: application/shellscript

Attachment: rrdtest-seconds.sh
Description: application/shellscript
