In terms of samples fetched from the DB, which is a cost-limiting factor in 
our setup, what is the overhead of rate() compared to increase()? Based on 
my reading so far, rate() requires all data points within the time range 
interval, so it will fetch every sample from storage. On the other hand, 
the increase() function would fetch only the first and last data points, 
plus the penultimate data points needed for interpolation/extrapolation. 
Is it correct to state that increase() has lower overhead than rate() in 
terms of samples fetched, with rate()'s overhead scaling with the time 
range interval?
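For concreteness, the pair of expressions I have in mind looks like the 
following (the metric name here is just a placeholder, not one of ours):

```promql
# Per-second average rate over the last hour; my understanding is that
# this has to read every sample inside the 1h window.
rate(http_requests_total[1h])

# Total increase over the same 1h window; I'm asking whether this could,
# in principle, be answered from only a few samples at the window edges.
increase(http_requests_total[1h])
```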

thanks
Johny

-- 
You received this message because you are subscribed to the Google Groups 
"Prometheus Users" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/prometheus-users/c6b65724-696c-4e5a-8606-8e4cfba98b84n%40googlegroups.com.