Sent: Wednesday, May 3, 2017 10:25 PM
To: Sidney Feiner
Cc: user@spark.apache.org
Subject: Re: [Spark Streaming] - Killing application from within code
There isn't a clean programmatic way to kill the application running in the
driver from an executor. You will have to set up an additional RPC mechanism
to explicitly send a signal from the executors to the application/driver to
quit.
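One lightweight way to approximate such a signal without standing up a full RPC service is for failing executor tasks to write a "poison" marker to storage the driver can see, with the driver polling for it between batches. The sketch below is a minimal single-process illustration of that pattern in Python; the names (`POISON_PATH`, `signal_driver_to_quit`, `driver_should_quit`) are hypothetical, and in a real cluster the marker would live in shared storage (HDFS, S3, a database row) rather than a local temp file.

```python
import os
import tempfile

# Hypothetical marker location; in a real deployment this must be
# storage visible to both the executors and the driver.
POISON_PATH = os.path.join(tempfile.gettempdir(), "streaming_poison_marker")

def signal_driver_to_quit(reason):
    """Executor side: record a fatal condition in shared storage."""
    with open(POISON_PATH, "w") as f:
        f.write(reason)

def driver_should_quit():
    """Driver side: poll between batches; returns the reason or None."""
    if os.path.exists(POISON_PATH):
        with open(POISON_PATH) as f:
            return f.read()
    return None

# Simulate an executor hitting a fatal error, then the driver noticing it.
signal_driver_to_quit("cannot connect to ES")
reason = driver_should_quit()
if reason:
    # In a real Spark app the driver would call ssc.stop() / sc.stop() here.
    print(f"stopping application: {reason}")
os.remove(POISON_PATH)
```

The polling adds at most one batch interval of latency before shutdown, which is usually acceptable for a fail-fast requirement like this one.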
On Wed, May 3, 2017 at 8:44 AM, Sidney Feiner
wrote:
Hey, I'm using connections to Elasticsearch from within my Spark Streaming
application.
I'm using Futures to maximize performance when it sends network requests to the
ES cluster.
Basically, I want my app to crash if any one of the executors fails to connect
to ES.
The exception gets caught an
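A common pitfall with fire-and-forget Futures is that an exception raised inside the Future never reaches the task thread, so Spark never sees the failure and the task completes "successfully". The fix is to block on the futures before the task returns, so any failure is re-raised where Spark can observe it. Here is a minimal sketch of that pattern in Python with `concurrent.futures` (the Scala equivalent would `Await.result` the Futures before the partition function returns); `send_to_es` is a hypothetical stand-in for a real Elasticsearch request:

```python
from concurrent.futures import ThreadPoolExecutor

def send_to_es(doc):
    # Stand-in for a real Elasticsearch request; fails for one
    # document to simulate a connection error.
    if doc == "bad":
        raise ConnectionError("cannot connect to ES")
    return "ok"

def process_partition(docs):
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(send_to_es, d) for d in docs]
        # Calling result() blocks and re-raises any exception on the
        # calling thread, so the failure propagates instead of being
        # silently dropped inside the pool.
        return [f.result() for f in futures]

try:
    process_partition(["a", "bad", "c"])
except ConnectionError as e:
    print(f"task failed: {e}")
```

Once the exception escapes the task, Spark retries the task up to its failure limit and then fails the job, which gives the fail-fast behavior described above without any extra machinery.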