Anyone have any idea? Or should I raise a bug for this?

Thanks,
Shams

On Fri, Mar 11, 2016 at 3:40 PM, Shams ul Haque <sham...@cashcare.in> wrote:

> Hi,
>
> I want to stop a Spark Streaming job gracefully, so that whatever Spark has
> already picked up from Kafka gets processed before the job exits. My Spark
> version is 1.6.0.
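>
> As far as I understand (this is an assumption on my part, not something I have
> verified on 1.6), since 1.4 there is a config switch that installs a
> graceful-stop shutdown hook automatically, along the lines of:
>
>     bin/spark-submit --master spark://shams-cashcare:7077 \
>       --conf spark.streaming.stopGracefullyOnShutdown=true \
>       my-streaming-app.jar
>
> where my-streaming-app.jar just stands in for my application jar.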
>
> Killing the Spark Streaming job from the Spark UI doesn't stop the app
> completely. In the Spark UI the job moves to the COMPLETED section, but the
> log keeps showing this error: http://pastebin.com/TbGrdzA2
> and the process is still visible with the *ps* command.
>
>
> I also tried to stop it with the command below:
>     *bin/spark-submit --master spark://shams-cashcare:7077 --kill
> app-20160311121141-0002*
> but it gives me the error:
>     Unable to connect to server spark://shams-cashcare:7077
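>
> One thing I am not sure about: whether --kill goes through the standalone REST
> submission server (which listens on port 6066 by default, not 7077) and expects
> the driver ID of a cluster-mode submission rather than the application ID. If
> so, the call would look roughly like:
>
>     bin/spark-submit --master spark://shams-cashcare:6066 --kill driver-20160311121141-0002
>
> where the driver ID here is purely illustrative; I have not been able to
> confirm this for my setup.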
>
> I have confirmed the Spark master host:port, and they are correct. I also
> added a shutdown hook in the code, roughly like the sketch below.
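>
> A minimal sketch of the kind of hook I added (ssc is an illustrative name for
> my StreamingContext, not the exact code from my app):
>
>     import org.apache.spark.streaming.StreamingContext
>
>     def installGracefulStop(ssc: StreamingContext): Unit =
>       sys.addShutdownHook {
>         // finish processing the batches already received from Kafka,
>         // then stop the underlying SparkContext as well
>         ssc.stop(stopSparkContext = true, stopGracefully = true)
>       }
>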
> What am I missing? If I am doing something wrong, please guide me.
>
