Hi Denes,

I think you can register your customized metrics source with the metrics system 
through metrics.properties; you can take metrics.properties.template as a 
reference.

Basically, if you want to monitor on the executor side, you can do the following:

executor.source.accumulator.class=xx.xx.xx.your-customized-metrics-source

I think the code below can only register the metrics source on the client side:

SparkEnv.get.metricsSystem.registerSource(accumulatorMetrics);

BTW, it's not a good choice to register through MetricsSystem directly; it would 
be nicer to register through configuration. You can also enable the console sink 
to verify whether the source is registered or not.
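For example, a rough metrics.properties sketch (the class name and package here 
are placeholders, not your actual source) that registers the source on executors 
and enables the console sink for verification:

# Register the custom source on executors (class name is a placeholder)
executor.source.accumulator.class=com.example.AccumulatorSource

# Enable the console sink on all instances to verify registration
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds

One caveat: if I remember correctly, sources registered through configuration are 
instantiated by reflection, so the source class typically needs a no-arg 
constructor.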

Thanks
Jerry


-----Original Message-----
From: Denes [mailto:te...@outlook.com] 
Sent: Tuesday, July 22, 2014 2:02 PM
To: u...@spark.incubator.apache.org
Subject: Re: Executor metrics in spark application

I'm also pretty interested in how to create custom sinks in Spark. I'm using it 
with Ganglia, and the normal metrics from the JVM source do show up. I tried to 
create my own metric based on Issac's code, but it does not show up in Ganglia.
Does anyone know where the problem is?
Here's the code snippet: 

import org.apache.spark.Accumulator
import org.apache.spark.metrics.source.Source
import com.codahale.metrics.{Gauge, MetricRegistry}

class AccumulatorSource(accumulator: Accumulator[Long], name: String) extends Source {

  val sourceName = "accumulator.metrics"
  val metricRegistry = new MetricRegistry()

  // Expose the accumulator's current value as a gauge
  metricRegistry.register(MetricRegistry.name("accumulator", name), new Gauge[Long] {
    override def getValue: Long = accumulator.value
  })
}

and then in the main:
val longAccumulator = sc.accumulator[Long](0)
val accumulatorMetrics = new AccumulatorSource(longAccumulator, "counters.accumulator")
SparkEnv.get.metricsSystem.registerSource(accumulatorMetrics)




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Executor-metrics-in-spark-application-tp188p10385.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
