So sorry. Your name was Pradeep !!

> -----Original Message-----
> From: Park Kyeong Hee [mailto:kh1979.p...@samsung.com]
> Sent: Wednesday, August 03, 2016 11:24 AM
> To: 'Pradeep'; 'user@spark.apache.org'
> Subject: RE: Stop Spark Streaming Jobs
>
> Hi. Paradeep
>
> Did you mean, how to kill the job?
> If yes, you should kill the driver and follow next.
>
> on yarn-client
> 1. find pid - "ps -es | grep <your_app_main_class>"
> 2. kill it - "kill -9 <pid>"
> 3. check executors were down - "yarn application -list"
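
For reference, a minimal shell sketch of the yarn-client steps above (a sketch, not the poster's exact commands): "MySparkStreamingApp" is a hypothetical driver main class name, so substitute your own, and it uses "ps -ef" for the full process listing. It assumes the driver is running on the machine where you type the commands.

  # 1. find the driver pid (the grep pattern is a placeholder for your job's main class)
  PID=$(ps -ef | grep MySparkStreamingApp | grep -v grep | awk '{print $2}')
  # 2. kill the driver process
  kill -9 "$PID"
  # 3. confirm the application and its executors are gone
  yarn application -list
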
>
> on yarn-cluster
> 1. find driver's application ID - "yarn application -list"
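
And a similar hedged sketch for yarn-cluster mode, where the driver runs inside the cluster, so the whole application is stopped through YARN. "MySparkStreamingApp" is again a hypothetical application name, and "yarn application -kill" is the standard YARN command for this (presumably the next step the quoted mail was heading toward).

  # 1. find the application ID by (placeholder) application name
  APP_ID=$(yarn application -list | grep MySparkStreamingApp | awk '{print $1}')
  # 2. ask YARN to kill the application; driver and executors go down together
  yarn application -kill "$APP_ID"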