Hi,
Were you able to set up custom metrics in GangliaSink? If so, how did you
register the custom metrics?
Thanks!
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Executor-metrics-in-spark-application-tp188p25647.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
As far as I understand, even if I could register the custom source, there is
no way to have a cluster-wide variable to pass to it: an accumulator can be
modified by tasks but read only by the driver, and a broadcast value is
constant. So it seems this custom metrics/sinks functionality is of limited
use for application-level metrics.
, July 22, 2014 6:38 PM
To: u...@spark.incubator.apache.org
Subject: RE: Executor metrics in spark application
Hi Jerry,
I know that way of registering a metric, but it seems to defeat the whole
purpose. I'd like to define a source whose value is set within the
application, for example the number of parsed messages.
If I register it in the metrics.properties, how can I obtain the instance?
(or instances?)
How can I set it?
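One approach (an assumption on my part, not a documented public API) is to skip metrics.properties entirely and register the instance yourself on the driver. `SparkEnv.get.metricsSystem` and `MetricsSystem.registerSource` exist in Spark 1.x but are `private[spark]`, so the calling code has to be compiled into an `org.apache.spark` subpackage. `MySource` here is a hypothetical stand-in for whatever Source subclass you define:

```scala
import org.apache.spark.SparkEnv

// Hypothetical: the application creates the source itself, so it keeps
// a reference to the instance and can update its counters directly.
val source = new MySource()
SparkEnv.get.metricsSystem.registerSource(source)

// Later, from application code on the same JVM:
source.counter.incrementAndGet()
```

Because you constructed the instance, the "how do I obtain it?" problem disappears; the trade-off is depending on a private API that may change between Spark versions.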
You can try a console sink to verify
whether the source is registered or not.
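For reference, a console sink can be enabled in conf/metrics.properties with something like this (the 10-second polling period is just illustrative):

```properties
# Poll every instance (master, worker, driver, executor) every 10 s
# and dump all registered sources to stdout.
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds
```

If your source's metrics show up on the console but not in Ganglia, the problem is on the GangliaSink/gmond side rather than in the source registration.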
Thanks
Jerry
-----Original Message-----
From: Denes [mailto:te...@outlook.com]
Sent: Tuesday, July 22, 2014 2:02 PM
To: u...@spark.incubator.apache.org
Subject: Re: Executor metrics in spark application
I meant custom Sources, sorry.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Executor-metrics-in-spark-application-tp188p10386.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I'm also pretty interested how to create custom Sinks in Spark. I'm using it
with Ganglia and the normal metrics from JVM source do show up. I tried to
create my own metric based on Issac's code, but it does not show up in
Ganglia. Does anyone know where the problem is?
Here's the code snippet:
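The snippet was cut off in the archive, so for comparison here is a minimal sketch of a custom source (not the poster's code; the class and metric names are made up). It assumes the Spark 1.x `Source` trait, which is `private[spark]`, so the class has to live under an `org.apache.spark` package to compile:

```scala
package org.apache.spark.metrics.source

import java.util.concurrent.atomic.AtomicLong
import com.codahale.metrics.{Gauge, MetricRegistry}

// Hypothetical application-level source: application code bumps
// `counter`, and the Gauge reports its current value each time a sink
// such as GangliaSink or ConsoleSink polls the registry.
class ParsedMessagesSource extends Source {
  override val sourceName: String = "parsedMessages"
  override val metricRegistry: MetricRegistry = new MetricRegistry

  val counter = new AtomicLong(0L)

  metricRegistry.register(MetricRegistry.name("count"), new Gauge[Long] {
    override def getValue: Long = counter.get()
  })
}
```

Note that a plain AtomicLong is only visible within one JVM: each executor process reports its own count, and aggregation across the cluster is left to Ganglia.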