Thanks for sharing, Yiannis. Looks very promising!

Do you know if I can package a custom class with my application, or does it
have to be pre-deployed on all Executor nodes?
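For reference, custom sinks are wired up through conf/metrics.properties on each node; a minimal, hypothetical entry, assuming a custom sink class com.example.metrics.MySink (a placeholder name) packaged in the application jar, might look like:

```properties
# conf/metrics.properties -- register a hypothetical custom sink
# (com.example.metrics.MySink is a placeholder, not a real Spark class)
*.sink.mysink.class=com.example.metrics.MySink
# sink-specific properties share the same prefix; period/unit control polling
*.sink.mysink.period=10
*.sink.mysink.unit=seconds
```

Since the metrics system is instantiated inside each JVM (driver and executors alike), the jar containing the sink class generally has to be reachable on each process's classpath, e.g. via spark.driver.extraClassPath and spark.executor.extraClassPath; whether shipping it with --jars is enough may depend on how early the metrics system initializes, so treat this as a sketch rather than a definitive answer.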

On Wed, Feb 3, 2016 at 10:36 AM, Yiannis Gkoufas <johngou...@gmail.com>
wrote:

> Hi Matt,
>
> There is some related work I did recently at IBM Research on visualizing
> the metrics produced.
> You can read about it here
> http://www.spark.tc/sparkoscope-enabling-spark-optimization-through-cross-stack-monitoring-and-visualization-2/
> We recently open-sourced it, in case you are interested in taking a deeper
> look: https://github.com/ibm-research-ireland/sparkoscope
>
> Thanks,
> Yiannis
>
> On 3 February 2016 at 13:32, Matt K <matvey1...@gmail.com> wrote:
>
>> Hi guys,
>>
>> I'm looking to create a custom sink based on Spark's Metrics System:
>>
>> https://github.com/apache/spark/blob/9f603fce78fcc997926e9a72dec44d48cbc396fc/core/src/main/scala/org/apache/spark/metrics/MetricsSystem.scala
>>
>> If I want to collect metrics from the Driver, Master, and Executor nodes,
>> should the jar with the custom class be installed on Driver, Master, and
>> Executor nodes?
>>
>> Also, on Executor nodes, does the MetricsSystem run inside the Executor's
>> JVM?
>>
>> Thanks,
>> -Matt
>>
>
>


-- 
www.calcmachine.com - easy online calculator.
