Good afternoon,

I'm attempting to integrate the metrics exposed via JMX into our internal framework; however, several of the metrics include One/Five/Fifteen-minute "rate" attributes with a RateUnit of "SECONDS". For example:
```
$> get -b org.apache.cassandra.metrics:name=Latency,scope=Write,type=ClientRequest *
#mbean = org.apache.cassandra.metrics:name=Latency,scope=Write,type=ClientRequest:
LatencyUnit = MICROSECONDS;
EventType = calls;
RateUnit = SECONDS;
MeanRate = 383.6944837362387;
FifteenMinuteRate = 868.8420188648543;
FiveMinuteRate = 817.5239450236011;
OneMinuteRate = 675.7673129014964;
Max = 498867.0;
Count = 31257426;
Min = 52.0;
50thPercentile = 926.0;
Mean = 1063.114029159023;
StdDev = 1638.1542477604232;
75thPercentile = 1064.75;
95thPercentile = 1304.55;
98thPercentile = 1504.3999999999992;
99thPercentile = 2307.3500000000104;
999thPercentile = 10491.850000000002;
```

What does the rate signify in this context? For example, given the OneMinuteRate of 675.7673129014964 and the unit of "SECONDS", what exactly is being measured? Is this the rate at which events are recorded, i.e., were there an average of (676 * 60 seconds) events recorded over the last minute? Thanks!
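For context, this is roughly how I'm pulling these attributes on our side. The sketch below uses a hypothetical local stand-in MBean (`LatencyStub`, registered under the same Cassandra-style `ObjectName`) so it runs without a live cluster; against a real node you'd connect to its remote `MBeanServerConnection` and read the actual `org.apache.cassandra.metrics` bean instead:

```java
import java.lang.management.ManagementFactory;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class RateReadDemo {
    // Hypothetical stand-in for the real Cassandra MBean, returning the
    // OneMinuteRate value from the jmxterm dump above.
    public interface LatencyStubMBean {
        double getOneMinuteRate();
    }

    public static class LatencyStub implements LatencyStubMBean {
        @Override
        public double getOneMinuteRate() {
            return 675.7673129014964;
        }
    }

    public static void main(String[] args) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        ObjectName name = new ObjectName(
            "org.apache.cassandra.metrics:name=Latency,scope=Write,type=ClientRequest");
        server.registerMBean(new LatencyStub(), name);

        // Read the attribute the same way jmxterm's `get` command does.
        double oneMinuteRate = (double) server.getAttribute(name, "OneMinuteRate");
        System.out.println("OneMinuteRate = " + oneMinuteRate);
    }
}
```

So the question above is really about what that returned `double` means given `RateUnit = SECONDS`.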