Re: Stop Spark Streaming Jobs

2016-08-04 Thread Sandeep Nemuri
So sorry. Your name was Pradeep !!

> -----Original Message-----
> From: Park Kyeong Hee [mailto:kh1979.p...@samsung.com]
> Sent: Wednesday, August 03, 2016 11:24 AM
> To: 'Pradeep'; 'user@spark.apache.org'
> Subject: RE: Stop Spark Streaming Jobs


Re: Stop Spark Streaming Jobs

2016-08-03 Thread Tony Lane

Re: Stop Spark Streaming Jobs

2016-08-02 Thread Pradeep

RE: Stop Spark Streaming Jobs

2016-08-02 Thread Park Kyeong Hee
Hi. Paradeep

Did you mean, how to kill the job? If yes, you should kill the driver and then follow these steps.

on yarn-client
1. find the driver pid - "ps -ef | grep <app_name>"
2. kill it - "kill -9 <pid>"
3. check that the executors went down - "yarn application -list"

on yarn-cluster
1. find the driver's application ID - "yarn application -list"
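The yarn-client procedure above can be sketched as a small dry-run script. This is only an illustration: "MyStreamingApp" is a hypothetical application name, and the commands are printed rather than executed, so nothing is actually killed. Note that `kill -9` is a forceful stop, so any in-flight micro-batch is lost.

```shell
# Dry-run sketch of the yarn-client shutdown steps above (illustrative only).
# "MyStreamingApp" is a hypothetical app name; substitute your driver's name.
APP_NAME="MyStreamingApp"

# 1. locate the driver pid in the process table
FIND_PID="ps -ef | grep $APP_NAME | grep -v grep | awk '{print \$2}'"

# 2. force-kill the driver (no chance for a graceful stop)
KILL_DRIVER="kill -9 <driver-pid>"

# 3. confirm with YARN that the executors went down
CHECK="yarn application -list"

# print the commands instead of running them
printf '%s\n' "$FIND_PID" "$KILL_DRIVER" "$CHECK"
```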