RE: [Spark Streaming] - Killing application from within code

2017-05-04 Thread Sidney Feiner
Sent: Wednesday, May 3, 2017 10:25 PM To: Sidney Feiner Cc: user@spark.apache.org Subject: Re: [Spark Streaming] - Killing application from within code. There isn't a clean programmatic way to kill the application running in the driver from the executor. You will have to set up an additional RPC …

Re: [Spark Streaming] - Killing application from within code

2017-05-03 Thread Tathagata Das
There isn't a clean programmatic way to kill the application running in the driver from the executor. You will have to set up an additional RPC mechanism to explicitly send a signal from the executors to the application/driver to quit.
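The reply above doesn't prescribe a particular mechanism, so the following is only a minimal sketch of one possible "additional RPC" setup in Scala: a plain socket listener on the driver that executors connect to when they hit a fatal error. SHUTDOWN_PORT, the driver-host lookup, and process() are illustrative assumptions, not Spark APIs.

import java.net.{InetAddress, ServerSocket, Socket}
import java.util.concurrent.atomic.AtomicBoolean
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ShutdownSignalExample {
  val SHUTDOWN_PORT = 9999  // hypothetical port for the driver-side listener

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("shutdown-signal-example")
    val ssc  = new StreamingContext(conf, Seconds(5))

    // Driver side: background thread that treats any incoming connection as a quit signal.
    val stopRequested = new AtomicBoolean(false)
    new Thread(new Runnable {
      override def run(): Unit = {
        val server = new ServerSocket(SHUTDOWN_PORT)
        server.accept()
        stopRequested.set(true)
        server.close()
      }
    }).start()

    val driverHost = InetAddress.getLocalHost.getHostAddress

    val lines = ssc.socketTextStream("localhost", 1234)  // placeholder input source
    lines.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        try {
          partition.foreach(process)  // hypothetical per-record work (e.g. indexing into ES)
        } catch {
          case fatal: Exception =>
            // Executor side: signal the driver before letting the task fail.
            new Socket(driverHost, SHUTDOWN_PORT).close()
            throw fatal
        }
      }
    }

    ssc.start()
    // Driver side: poll the flag and shut the whole application down once signalled.
    while (!ssc.awaitTerminationOrTimeout(10000)) {
      if (stopRequested.get()) {
        ssc.stop(stopSparkContext = true, stopGracefully = false)
      }
    }
  }

  def process(record: String): Unit = { /* send the record to Elasticsearch, etc. */ }
}

The same idea works with any transport the cluster already has (a message queue, a marker file on shared storage, an HTTP endpoint on the driver); the only requirement is that the driver has something to listen on or poll, because an exception on an executor by itself only fails a task, not the application.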

[Spark Streaming] - Killing application from within code

2017-05-03 Thread Sidney Feiner
Hey, I'm using connections to Elasticsearch from within my Spark Streaming application. I'm using Futures to maximize performance when it sends network requests to the ES cluster. Basically, I want my app to crash if any one of the executors fails to connect to ES. The exception gets caught and …
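For context, a minimal sketch of the pattern described above, assuming the Elasticsearch requests are fired as Scala Futures inside foreachPartition; esBulkIndex and the batch size are hypothetical stand-ins for the real client call:

import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

object EsWriter {
  def writePartition(records: Iterator[String]): Unit = {
    // Fire bulk requests concurrently to hide network latency.
    val pending = records.grouped(500).map { batch =>
      Future { esBulkIndex(batch) }  // hypothetical network call to the ES cluster
    }.toList

    // If a connection failure stays inside an un-awaited Future, the task still
    // finishes "successfully" and nothing ever reaches the driver. Awaiting here
    // re-throws the failure on the executor, so at least the task fails; getting
    // from a failed task to a dead application is the part that needs the extra
    // signalling discussed in the replies earlier in this thread.
    Await.result(Future.sequence(pending), 1.minute)
  }

  def esBulkIndex(batch: Seq[String]): Unit = { /* send a bulk request to ES */ }
}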