I have a Java class that calls SparkSubmit.scala with all the arguments to
run a Spark job in a separate thread. For now I am running the jobs in local
mode, but I also want to run them in yarn-cluster mode later.
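To make the setup concrete, the launcher looks roughly like this (the class
name, jar path and master value are just placeholders for my real
configuration):

    import org.apache.spark.deploy.SparkSubmit;

    public class JobLauncher {
        public static void main(String[] args) {
            // Placeholder arguments; the real class, jar and master come
            // from my own configuration.
            final String[] submitArgs = new String[] {
                "--class", "com.example.MyJob",
                "--master", "local[*]",        // later this will be "yarn-cluster"
                "/path/to/my-job.jar"
            };

            // Run spark-submit in its own thread so the caller keeps running.
            Thread jobThread = new Thread(new Runnable() {
                @Override
                public void run() {
                    SparkSubmit.main(submitArgs);
                }
            });
            jobThread.start();
        }
    }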

Now I want to kill the running Spark job (which may be in either local or
yarn-cluster mode) programmatically.

I know that SparkContext has a stop() method, but the thread from which I
call SparkSubmit never gets a reference to the context, so I cannot call it.
Can someone suggest how to do this properly?
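For clarity, this is the kind of call I would like to make, assuming I had a
reference to the driver's SparkContext (which I don't have in the launcher
thread):

    import org.apache.spark.SparkContext;

    public class JobKiller {
        // Assumes we somehow obtained the driver's SparkContext, which is
        // exactly the part missing in my launcher thread.
        public static void killJob(SparkContext sc) {
            sc.stop();  // shuts down the context and the job running on it
        }
    }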

Thanks.



