You should be able to kill the job using the web UI or via spark-class.
More info can be found in the thread:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-kill-a-Spark-job-running-in-cluster-mode-td18583.html.
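
For the standalone cluster-mode case, something like this should work (a rough
sketch; the master host/port and driver ID below are placeholders you'd take
from your own setup):

    # Find the app/driver by name: the Master web UI (default port 8080)
    # lists running applications, and the same info is exposed as JSON
    curl -s http://<master-host>:8080/json

    # Kill a driver that was submitted in cluster mode, using the driver ID
    # shown in the Master web UI
    ./bin/spark-class org.apache.spark.deploy.Client kill spark://<master-host>:7077 <driver-id>

On YARN the equivalent would be yarn application -list to find the app by
name, then yarn application -kill <app-id> to stop it.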


HTH!

On Tue, Dec 23, 2014 at 4:47 PM, durga <durgak...@gmail.com> wrote:

> Hi All ,
>
> It seems the problem is a little more complicated.
>
> The job is hung up reading an S3 file. Even if I kill the Unix process
> that started the job, the Spark job is not killed. It is still hung up
> there.
>
> Now the questions are :
>
> How do I find a Spark job based on its name?
> How do I kill a Spark job based on its name?
>
> Thanks for helping me.
> -D
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/S3-files-Spark-job-hungsup-tp20806p20842.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
