A SparkContext is thread-safe, so you can just have different threads
that create their own RDDs and run actions on them, etc.
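For example, here's a minimal sketch of that pattern, submitting two
actions concurrently from one shared SparkContext via Scala futures (the
RDDs and job bodies are just illustrative):

    import org.apache.spark.{SparkConf, SparkContext}
    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration.Duration

    val sc = new SparkContext(new SparkConf().setAppName("concurrent-jobs"))

    // Each action submits its own Spark job; both jobs run concurrently
    // on the shared, thread-safe SparkContext.
    val jobA = Future { sc.parallelize(1 to 1000000).map(_ * 2).count() }
    val jobB = Future { sc.parallelize(1 to 1000000).filter(_ % 3 == 0).count() }

    println(Await.result(jobA, Duration.Inf) + Await.result(jobB, Duration.Inf))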
- Patrick
On Mon, Dec 22, 2014 at 4:15 PM, Alessandro Baretta wrote:
Andrew,
Thanks, yes, this is what I wanted: basically just to start multiple jobs
concurrently in threads.
Alex
On Mon, Dec 22, 2014 at 4:04 PM, Andrew Ash wrote:
Hi Alex,
SparkContext.submitJob() is marked as experimental -- most client programs
shouldn't be using it. What are you looking to do?
For multiplexing jobs, one thing you can do is have multiple threads in
your client JVM each submit jobs on your SparkContext. This is described
in the Spark job scheduling docs, under scheduling within an application.
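As a sketch of what that looks like combined with the fair scheduler
(the pool names here are illustrative; pools are created on demand with
default settings unless you supply an allocation file):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("fair-pools").set("spark.scheduler.mode", "FAIR")
    val sc = new SparkContext(conf)

    def inPool(pool: String)(body: => Unit): Thread = {
      val t = new Thread(new Runnable {
        def run(): Unit = {
          // Local properties are per-thread, so each thread's jobs land in its pool.
          sc.setLocalProperty("spark.scheduler.pool", pool)
          body
        }
      })
      t.start(); t
    }

    val t1 = inPool("production") { sc.parallelize(1 to 100).count() }
    val t2 = inPool("adhoc")      { sc.parallelize(1 to 100).sum() }
    t1.join(); t2.join()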
Fellow Sparkers,
I'm rather puzzled by the submitJob API. I can't quite figure out how it is
supposed to be used. Is there any more documentation about it?
Also, is there any simpler way to multiplex jobs on the cluster, such as
starting multiple computations in as many threads in the driver and waiting
for their results?
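For reference, a minimal sketch of how the experimental submitJob call
fits together (the per-partition sum and the handler here are
illustrative, not the only way to use it):

    import org.apache.spark.{SparkConf, SparkContext}
    import scala.concurrent.Await
    import scala.concurrent.duration.Duration

    val sc  = new SparkContext(new SparkConf().setAppName("submitJob-sketch"))
    val rdd = sc.parallelize(1 to 100, 4)

    // submitJob(rdd, processPartition, partitions, resultHandler, resultFunc)
    val fa = sc.submitJob(
      rdd,
      (it: Iterator[Int]) => it.sum,                           // runs on each partition
      Seq(0, 1, 2, 3),                                         // partition ids to compute
      (idx: Int, s: Int) => println(s"partition $idx -> $s"),  // called as results arrive
      ()                                                       // final result once all partitions finish
    )
    Await.result(fa, Duration.Inf)  // FutureAction extends scala.concurrent.Future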