You can use Ganglia, Ambari, or Nagios to monitor the Spark workers/master. The Spark executors themselves are resilient. There are also many proprietary vendors that offer Hadoop application monitoring.
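If you go the Ganglia route, Spark can also push its own metrics to it via the Ganglia sink in conf/metrics.properties. A rough sketch (the host value is a placeholder, and the sink needs the spark-ganglia-lgpl artifact on the classpath since it is shipped separately for licensing reasons):

    # conf/metrics.properties (sketch; requires the spark-ganglia-lgpl package)
    # send metrics from all instances (master, worker, driver, executor) to Ganglia
    *.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
    # placeholder gmond/gmetad host and the default Ganglia port
    *.sink.ganglia.host=your-ganglia-host
    *.sink.ganglia.port=8649
    # report every 10 seconds
    *.sink.ganglia.period=10
    *.sink.ganglia.unit=seconds

For a simple up/down check from Nagios or similar, you can also point an HTTP check at the standalone master's web UI (port 8080 by default); I believe it exposes a JSON view at /json that lists alive workers and running applications.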
On Tue, Jun 27, 2017 at 5:03 PM, anna stax <annasta...@gmail.com> wrote:

> Hi all,
>
> I have a spark standalone cluster. I am running a spark streaming
> application on it and the deploy mode is client. I am looking for the best
> way to monitor the cluster and application so that I will know when the
> application/cluster is down. I cannot move to cluster deploy mode now.
>
> I appreciate your thoughts.
>
> Thanks
> -Anna