Hi,

Thanks Sachin, the workaround you proposed for reading the metric values from 
the JMX beans works: the type has to be specified before the client id. 
Without this workaround, however, the metrics cannot be accessed, so this will 
probably become an obstacle for users. The original problem description: 
http://search-hadoop.com/m/Kafka/uyzND1HWVV61uorew1?subj=Cannot+access+Kafka+Streams+JMX+metrics+using+jmxterm.
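For illustration, something like the following Scala snippet, run inside the 
Streams application JVM, lists the registered metric MBeans (the kafka.streams 
domain and the helper name are assumptions on my side); the printed names show 
the key order, with the type before the client id, that has to be passed to 
jmxterm:

import java.lang.management.ManagementFactory
import javax.management.ObjectName
import scala.collection.JavaConverters._

// Prints every MBean registered under the (assumed) kafka.streams domain;
// call it from the application itself, e.g. after starting the topology.
def listStreamsMBeans(): Unit = {
  val server = ManagementFactory.getPlatformMBeanServer
  server.queryNames(new ObjectName("kafka.streams:*"), null)
    .asScala
    .foreach(println)
}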

Problem regarding custom metrics reporters:

However, I encountered another problem regarding metrics. Michael Ross and 
Anish Mashankar reported related problems: 
http://search-hadoop.com/m/Kafka/uyzND1mggcGVyRJg2?subj=Kafka+metrics+always+reporting+zero
 and 
http://search-hadoop.com/m/Kafka/uyzND1n8QXT1WAsRy?subj=Kafka+0+10+0+MetricsReporter+implementation+only+reporting+zeros.

I experience the same behavior as Michael Ross: the initialization works, but 
the metricChange callback only ever receives 0.0d and -Inf values for the 
metrics. As in Michael Ross's problem description, I checked the values via 
JMX, and there they are correct. For debugging, my metrics reporter looks like 
this:

import java.util.{List => JList, Map => JMap}
import org.apache.kafka.common.metrics.{KafkaMetric, MetricsReporter}

// Logging is assumed to provide an slf4j-style `log` field.
class TraceReporter extends MetricsReporter with Logging {
  override def configure(map: JMap[String, _]): Unit = ()
  override def init(list: JList[KafkaMetric]): Unit = ()
  // Logs every metric update; in my runs this only ever prints 0.0 or -Infinity.
  override def metricChange(kafkaMetric: KafkaMetric): Unit =
    log.trace(s"change ${kafkaMetric.metricName()} -> ${kafkaMetric.value()}")
  override def metricRemoval(kafkaMetric: KafkaMetric): Unit = ()
  override def close(): Unit = ()
}
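For completeness, the reporter is registered through the metric.reporters 
setting, roughly like this (the application id and broker address below are 
placeholders):

import java.util.Properties
import org.apache.kafka.streams.StreamsConfig

val props = new Properties()
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "trace-reporter-test") // placeholder
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")   // placeholder
// "metric.reporters" takes a comma-separated list of fully qualified class names.
props.put("metric.reporters", classOf[TraceReporter].getName)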

Michael Ross or Anish Mashankar, did you find a solution for this problem?

Thanks,
Jendrik Poloczek
