I am planning to write a custom monitoring application, and for that I have analysed
org.apache.spark.streaming.scheduler.StreamingListener. Is there another Spark
Streaming API which can give me insight into the cluster, like total
processing time, delay, etc.?
Tarun
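For anyone following this thread: the per-batch processing and scheduling delays that StreamingListener exposes can back a simple alerting policy. Below is a minimal standalone sketch of such a policy; the class name, thresholds, and method shape are illustrative, and the Spark-facing glue (a listener whose onBatchCompleted callback extracts the batch delays and passes them in) is omitted so the snippet compiles on its own.

```java
// Sketch of delay-based alerting that could back a custom monitoring app.
// In a real monitor, the two delay values would come from Spark's
// StreamingListener.onBatchCompleted callback; here they are plain arguments
// so the class stands alone. All names and limits are illustrative.
public class BatchDelayMonitor {
    private final long processingDelayLimitMs;
    private final long totalDelayLimitMs;

    public BatchDelayMonitor(long processingDelayLimitMs, long totalDelayLimitMs) {
        this.processingDelayLimitMs = processingDelayLimitMs;
        this.totalDelayLimitMs = totalDelayLimitMs;
    }

    /** Returns true when this batch should trigger a notification. */
    public boolean shouldAlert(long processingDelayMs, long schedulingDelayMs) {
        long totalDelayMs = processingDelayMs + schedulingDelayMs;
        return processingDelayMs > processingDelayLimitMs
                || totalDelayMs > totalDelayLimitMs;
    }

    public static void main(String[] args) {
        BatchDelayMonitor monitor = new BatchDelayMonitor(2000, 5000);
        System.out.println(monitor.shouldAlert(500, 100));   // healthy batch
        System.out.println(monitor.shouldAlert(3000, 4000)); // delayed batch
    }
}
```

The actual notification (email, Nagios passive check, etc.) would hang off the true branch; the point of the sketch is only the threshold decision.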

Date: Tue, 14 Oct 2014 23:19:47 +0530
Subject: Re: Spark Cluster health check
From: ak...@sigmoidanalytics.com
To: bigdat...@live.com
CC: user@spark.apache.org

Yes, for that you can tweak Nagios a bit, or you can write a custom monitoring
application which will check the processing delay, etc.

Thanks
Best Regards
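One practical detail for such a custom application: alerting on a single slow batch is noisy, so it usually helps to smooth the delay over a sliding window of recent batches first. A small standalone sketch of that smoothing, with an illustrative window size (each sample would in practice be fed in from a batch-completed callback):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Smooths per-batch processing delay over a sliding window before alerting,
// so one slow batch does not fire a notification. The window size and the
// way samples are fed in are illustrative.
public class SlidingDelayAverage {
    private final int window;
    private final Deque<Long> samples = new ArrayDeque<>();
    private long sum = 0;

    public SlidingDelayAverage(int window) {
        this.window = window;
    }

    /** Record one batch's processing delay, evicting the oldest sample. */
    public void record(long delayMs) {
        samples.addLast(delayMs);
        sum += delayMs;
        if (samples.size() > window) {
            sum -= samples.removeFirst();
        }
    }

    /** Average delay over the current window, 0 if no samples yet. */
    public double average() {
        return samples.isEmpty() ? 0.0 : (double) sum / samples.size();
    }

    public static void main(String[] args) {
        SlidingDelayAverage avg = new SlidingDelayAverage(2);
        avg.record(1000);
        avg.record(2000);
        avg.record(3000); // the 1000 ms sample falls out of the window
        System.out.println(avg.average());
    }
}
```

The smoothed value is then what gets compared against a threshold before the monitor pages anyone.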

On Tue, Oct 14, 2014 at 10:16 PM, Tarun Garg <bigdat...@live.com> wrote:



Thanks for your response. It is not about infrastructure, because I am using EC2
machines and Amazon CloudWatch can provide the EC2 nodes' CPU usage and memory usage
details; but I need to send notifications in situations like high processing delay,
high total delay, the maximum rate being low, etc.
Tarun

Date: Tue, 14 Oct 2014 12:09:35 +0530
Subject: Re: Spark Cluster health check
From: ak...@sigmoidanalytics.com
To: bigdat...@live.com
CC: user@spark.apache.org

Hi Tarun, 
You can use Ganglia for monitoring the entire cluster, and if you want some
more custom functionality, like sending emails etc., then you can go for
Nagios.

Thanks
Best Regards

On Tue, Oct 14, 2014 at 3:31 AM, Tarun Garg <bigdat...@live.com> wrote:



Hi All,
I am doing a POC and have written a job in Java, so the architecture has Kafka and
Spark. Now I want a process to notify me whenever system performance is degrading
or the cluster is in a crunch of resources, like CPU or RAM. I understand
org.apache.spark.streaming.scheduler.StreamingListener, but it has very limited
functionality.
Can anyone suggest a way (an API) to write a Spark cluster watcher?
Thanks