restrict my spark app to run on specific machines

2016-05-04 Thread Shams ul Haque
Hi, I have a cluster of 4 machines for Spark. I want my Spark app to run on 2 machines only, and the other 2 machines to be left for other Spark apps. So my question is: can I restrict my app to those 2 machines only, by passing some IPs at the time of setting SparkConf, or by any other setting? Thanks, Shams
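Spark standalone mode has no built-in per-app host whitelist, so a list of IPs cannot be passed through SparkConf. A common workaround is to cap the total cores the app may take with `spark.cores.max`, so it can occupy at most a subset of the workers. A minimal sketch, assuming each of the 4 workers offers 2 cores; the master URL and app name are placeholders:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class RestrictedApp {
    public static void main(String[] args) {
        // Capping total cores keeps this app on a subset of the cluster.
        // Assuming each of the 4 workers offers 2 cores, a cap of 4 cores
        // means the app can occupy at most 2 machines (though not a
        // specific, pinned pair of machines).
        SparkConf conf = new SparkConf()
                .setAppName("restricted-app")
                .setMaster("spark://master-host:7077") // hypothetical master URL
                .set("spark.cores.max", "4");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... job logic ...
        sc.stop();
    }
}
```

For hard isolation by machine, another option is to run a second standalone master whose workers are started only on the two target machines, and submit each app to its own master.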

Re: Spark streaming app starts processing when kill that app

2016-05-03 Thread Shams ul Haque
Hey Hareesh, thanks for the help; they were starving. I increased the cores and memory on that machine, and now it is working fine. Thanks again. On Tue, May 3, 2016 at 12:57 PM, Shams ul Haque wrote: > No, I made a cluster of 2 machines. And after submission to the master, this > app moves on
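For reference, the starvation happens because each receiver pins one core full-time: a receiver-based streaming app needs at least one core more than its receiver count, or no cores are left to process the batches and data only drains at shutdown. A minimal sketch of the rule; the app name is a placeholder:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamingMinCores {
    public static void main(String[] args) {
        // A receiver occupies one core full-time, so the app needs at
        // least receivers + 1 cores. "local[2]" reserves one core for the
        // receiver and one for processing; never use "local" or "local[1]"
        // with a receiver-based stream.
        SparkConf conf = new SparkConf()
                .setAppName("streaming-min-cores")
                .setMaster("local[2]");
        JavaStreamingContext jssc =
                new JavaStreamingContext(conf, Durations.seconds(10));
        // ... create the Kafka stream and processing here ...
    }
}
```

The same rule applies on a cluster: the total cores granted to the app must exceed the number of receivers.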

Re: Spark streaming app starts processing when kill that app

2016-05-03 Thread Shams ul Haque
>> …cess any data coming from Kafka. And when I kill that app by pressing Ctrl + C on the terminal, it starts processing all data received from Kafka and then gets shut down. I am trying to figure out why this is happening. Please help me if you know anything. Thanks and regards, Shams ul Haque

Spark streaming app starts processing when kill that app

2016-05-03 Thread Shams ul Haque
…starts processing all data received from Kafka and then gets shut down. I am trying to figure out why this is happening. Please help me if you know anything. Thanks and regards, Shams ul Haque

Re: kill Spark Streaming job gracefully

2016-03-14 Thread Shams ul Haque
Anyone have any idea? Or should I raise a bug for that? Thanks, Shams. On Fri, Mar 11, 2016 at 3:40 PM, Shams ul Haque wrote: > Hi, I want to kill a Spark Streaming job gracefully, so that whatever Spark > has picked up from Kafka gets processed. My Spark version is 1.6.0.

kill Spark Streaming job gracefully

2016-03-11 Thread Shams ul Haque
Hi, I want to kill a Spark Streaming job gracefully, so that whatever Spark has picked up from Kafka gets processed. My Spark version is 1.6.0. When I tried killing a Spark Streaming job from the Spark UI, it didn't stop the app completely. In the Spark UI the job is moved to the COMPLETED section, but in the log it continuou…
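One approach that should apply to Spark 1.6 is the `spark.streaming.stopGracefullyOnShutdown` setting (available since around Spark 1.4), which makes a SIGTERM drain the already-received batches before the JVM exits; the programmatic `jssc.stop(true, true)` call does the same. A sketch, with the Kafka wiring elided:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class GracefulStop {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("graceful-stop")
                // On shutdown (SIGTERM), finish processing the batches
                // already received before stopping, instead of dropping
                // them.
                .set("spark.streaming.stopGracefullyOnShutdown", "true");
        JavaStreamingContext jssc =
                new JavaStreamingContext(conf, Durations.seconds(10));
        // ... wire up the Kafka DStream and processing here ...
        jssc.start();
        jssc.awaitTermination();
        // Programmatic alternative, e.g. from a shutdown endpoint:
        // jssc.stop(true /* stopSparkContext */, true /* stopGracefully */);
    }
}
```

Kill the driver with a plain SIGTERM (`kill <pid>`), not `kill -9`, so the shutdown hook can run.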

Re: does spark needs dedicated machines to run on

2016-03-10 Thread Shams ul Haque
pastebin.com/0LjTWLfm Thanks, Shams. On Thu, Mar 10, 2016 at 8:11 PM, Ted Yu wrote: > Can you provide a bit more information? The release of Spark, the command for submitting your app, a code snippet of your app, and a pastebin of the log. > Thanks. > On Thu, Mar 10, 2016 at

does spark needs dedicated machines to run on

2016-03-10 Thread Shams ul Haque
Hi, I have developed a Spark realtime app and started Spark standalone on my laptop. But when I try to submit that app to Spark, it is always in the WAITING state and its Cores count is always zero. I have set export SPARK_WORKER_CORES="2" and export SPARK_EXECUTOR_CORES="1" in spark-env.sh, but still nothing hap…
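An app stuck in WAITING with zero cores usually means no worker has registered with the master (check the master UI, typically at port 8080), or the registered workers cannot satisfy the requested resources. Two points worth checking: `SPARK_WORKER_CORES` is only read when the worker daemon starts, so editing spark-env.sh does nothing for an already-running cluster; and `SPARK_EXECUTOR_CORES` in spark-env.sh is a YARN-mode option rather than a standalone one. A sketch of the standalone settings; the values are illustrative:

```shell
# conf/spark-env.sh on each worker machine (illustrative values).
export SPARK_WORKER_CORES=2      # cores this worker offers to apps
export SPARK_WORKER_MEMORY=2g    # memory this worker offers to apps

# Restart the standalone daemons so the new limits actually register:
./sbin/stop-all.sh
./sbin/start-all.sh
```

After the restart, the master UI should list the worker with the expected cores before any app is submitted.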

using MongoDB Tailable Cursor in Spark Streaming

2016-03-07 Thread Shams ul Haque
Hi, I want to implement streaming using a MongoDB tailable cursor. Please give me a hint on how I can do this. I think I have to extend some class and use its methods to do the job. Thanks and regards, Shams ul Haque
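The class to extend is `org.apache.spark.streaming.receiver.Receiver`: Spark's custom-receiver API lets a background thread tail the collection and push each document into Spark with `store()`. A minimal sketch; the MongoDB driver calls are deliberately left as comments rather than pinned to a particular driver version:

```java
import org.apache.spark.storage.StorageLevel;
import org.apache.spark.streaming.receiver.Receiver;

// Sketch of a custom receiver fed by a MongoDB tailable cursor.
public class MongoTailableReceiver extends Receiver<String> {

    public MongoTailableReceiver() {
        super(StorageLevel.MEMORY_AND_DISK_2());
    }

    @Override
    public void onStart() {
        // Called by Spark on the executor; must not block, so run the
        // cursor loop on a background thread.
        new Thread(this::receive, "mongo-tailable-receiver").start();
    }

    @Override
    public void onStop() {
        // The receive loop checks isStopped(), so nothing else to do here.
    }

    private void receive() {
        while (!isStopped()) {
            // 1. Open a tailable cursor on a capped collection.
            // 2. For each new document: store(documentAsJsonString);
            // 3. Reopen the cursor if the server closes it.
        }
    }
}
```

The stream is then created with `jssc.receiverStream(new MongoTailableReceiver())`. Note that tailable cursors only work on capped collections.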

Re: merge 3 different types of RDDs in one

2015-12-01 Thread Shams ul Haque

merge 3 different types of RDDs in one

2015-12-01 Thread Shams ul Haque
Hi all, I have made 3 RDDs from 3 different datasets. All RDDs are grouped by CustomerID; 2 of the RDDs have values of Iterable type and one has a single bean. All RDDs use an id of Long type as the CustomerId. Below are the models for the 3 RDDs: JavaPairRDD> JavaPairRDD> JavaPairRDD Now, I have to merge all th…
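Assuming the two Iterable-valued RDDs came out of a `groupByKey`, it is usually simpler to `cogroup` the three ungrouped originals, since cogroup both groups and joins by key in one pass. A sketch; the bean names (ProductBean, OrderBean, CustomerBean) are placeholders for the three datasets in the question:

```java
import org.apache.spark.api.java.JavaPairRDD;
import scala.Tuple3;

public class MergeByCustomer {
    // cogroup joins all three pair RDDs on the Long customerId; each side
    // comes back as an Iterable, which is empty when a customer is missing
    // from that dataset (so this behaves like a full outer join).
    static JavaPairRDD<Long, Tuple3<Iterable<ProductBean>,
                                    Iterable<OrderBean>,
                                    Iterable<CustomerBean>>>
    merge(JavaPairRDD<Long, ProductBean> products,
          JavaPairRDD<Long, OrderBean> orders,
          JavaPairRDD<Long, CustomerBean> customers) {
        return products.cogroup(orders, customers);
    }
}
```

For the single-bean dataset, the resulting Iterable simply holds zero or one element per customer.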

Separate all values from Iterable

2015-10-27 Thread Shams ul Haque
Hi, I have grouped all my customers in a JavaPairRDD> by their customerId (of Long type), meaning every customerId has a List of ProductBean. Now I want to save all the ProductBeans to the DB irrespective of customerId. I got all the values by using the method JavaRDD> values = custGroupRDD.values(); Now I want to
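To drop the customerId grouping entirely, `values()` followed by `flatMap` unrolls each customer's Iterable into one flat RDD of beans. A sketch against the Spark 1.x Java API, where the flatMap function returns an Iterable (in Spark 2.x it must return an Iterator instead); the bean name is taken from the question:

```java
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;

public class FlattenValues {
    // custGroupRDD maps customerId -> Iterable<ProductBean>; flatMap
    // unrolls every Iterable so the result is one flat RDD of all beans,
    // irrespective of customerId.
    static JavaRDD<ProductBean> flatten(
            JavaPairRDD<Long, Iterable<ProductBean>> custGroupRDD) {
        return custGroupRDD.values()
                .flatMap(beans -> beans); // Spark 1.x: return the Iterable
    }
}
```

The flat RDD can then be written out with `foreachPartition`, opening one DB connection per partition rather than one per bean.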