Re: Instantiating/starting Spark jobs programmatically

2015-04-23 Thread Anshul Singhle
Hi firemonk9, What you're doing looks interesting. Can you share some more details? Are you running the same Spark context for each job, or a separate Spark context for each job? Does your system need sharing of RDDs across multiple jobs? If so, how do you implement that? Also wh…
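For illustration, a minimal sketch of the shared-context pattern the question describes: one SparkContext reused by several jobs submitted from different threads, with a single cached RDD visible to all of them. The application name, data size, and timeouts below are placeholders, not details from the thread.

    import org.apache.spark.{SparkConf, SparkContext}
    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration._

    object SharedContextSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("shared-context"))

        // One RDD cached once and reused by every job on this context
        val shared = sc.parallelize(1 to 1000000).cache()

        // Two jobs submitted concurrently against the same SparkContext
        val jobA = Future { shared.filter(_ % 2 == 0).count() }
        val jobB = Future { shared.map(_ * 2).count() }

        println(Await.result(jobA, 10.minutes))
        println(Await.result(jobB, 10.minutes))
        sc.stop()
      }
    }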

Re: Instantiating/starting Spark jobs programmatically

2015-04-23 Thread Dean Wampler
I strongly recommend spawning a new process for the Spark jobs. Much cleaner separation. Your driver program won't be clobbered if the Spark job dies, etc. It can even watch for failures and restart. In the Scala standard library, the sys.process package has classes for constructing and interoperating with external processes…
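A minimal sketch of that approach using sys.process, assuming spark-submit is on the PATH and the application jar is already built; the main class, master URL, and jar path below are placeholders.

    import scala.sys.process._

    object SubmitAsChildProcess {
      def main(args: Array[String]): Unit = {
        val cmd = Seq(
          "spark-submit",
          "--class", "com.example.MySparkJob",   // placeholder main class
          "--master", "spark://master:7077",     // placeholder master URL
          "/path/to/my-spark-job.jar")           // placeholder jar path

        // Run the job in a child process; this driver program stays isolated
        // from any System.exit or crash inside the Spark job.
        val exitCode = cmd.!
        if (exitCode != 0) {
          // The parent can watch for failures and restart, as suggested above.
          println(s"Spark job exited with code $exitCode; could restart here")
        }
      }
    }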

Re: Instantiating/starting Spark jobs programmatically

2015-04-21 Thread Steve Loughran
On 21 Apr 2015, at 17:34, Richard Marscher <rmarsc...@localytics.com> wrote: - There are System.exit calls built into Spark as of now that could kill your running JVM. We have shadowed some of the most offensive bits within our own application to work around this. You'd likely want to d…
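One generic guard against stray System.exit calls, not necessarily the class-shadowing Richard describes, is a SecurityManager that turns exit attempts into catchable exceptions; a sketch of that technique:

    import java.security.Permission

    // Converts any System.exit in this JVM into an exception,
    // so a misbehaving library call cannot silently kill the driver.
    class NoExitSecurityManager extends SecurityManager {
      override def checkExit(status: Int): Unit =
        throw new SecurityException(s"System.exit($status) blocked")

      // Allow everything else so normal operation is unaffected
      override def checkPermission(perm: Permission): Unit = ()
    }

    // Install once, early in the driver's startup:
    // System.setSecurityManager(new NoExitSecurityManager)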

Re: Instantiating/starting Spark jobs programmatically

2015-04-21 Thread Richard Marscher
> hope this helps.

Re: Instantiating/starting Spark jobs programmatically

2015-04-20 Thread firemonk9

Instantiating/starting Spark jobs programmatically

2015-04-20 Thread Ajay Singal
…information/tips/best-practices in this regard? Cheers! Ajay
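As a starting point, one way to instantiate a Spark job programmatically is to build the SparkConf and SparkContext directly inside the calling application; a minimal sketch, with placeholder app name, master URL, and job logic:

    import org.apache.spark.{SparkConf, SparkContext}

    object EmbeddedSparkJob {
      def run(): Long = {
        val conf = new SparkConf()
          .setAppName("programmatic-job")
          .setMaster("spark://master:7077")   // placeholder; "local[*]" for testing
        val sc = new SparkContext(conf)
        try {
          // The job itself: any actions submitted on this context
          sc.parallelize(1 to 100).map(_ * 2).count()
        } finally {
          sc.stop()
        }
      }
    }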