A SparkContext is thread-safe, so you can just have different threads
that create their own RDDs and run actions, etc.
- Patrick
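As a minimal sketch of the pattern Patrick describes, the snippet below runs three independent jobs concurrently against one shared SparkContext using Scala Futures. It assumes Spark is on the classpath; the app name, local master, and job bodies are purely illustrative.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ConcurrentJobs {
  def main(args: Array[String]): Unit = {
    // Illustrative local-mode context; in a real app this would be your
    // existing, shared SparkContext.
    val sc = new SparkContext(
      new SparkConf().setMaster("local[4]").setAppName("concurrent-jobs"))
    try {
      // Each Future runs on its own thread and submits an independent job
      // through the single shared SparkContext.
      val jobs = (1 to 3).map { i =>
        Future {
          sc.parallelize(1 to 100).map(_.toDouble * i).sum()
        }
      }
      // Block until all three jobs finish and collect their results.
      val results = Await.result(Future.sequence(jobs), 2.minutes)
      println(results) // sums of 1..100 scaled by 1, 2, and 3
    } finally {
      sc.stop()
    }
  }
}
```

Because the actions are submitted from separate threads, Spark can schedule the three jobs concurrently rather than one after another, which is the "multiplexing" behavior discussed later in this thread.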
On Mon, Dec 22, 2014 at 4:15 PM, Alessandro Baretta wrote:
Andrew,
Thanks, yes, this is what I wanted: basically just to start multiple jobs
concurrently in threads.
Alex
On Mon, Dec 22, 2014 at 4:04 PM, Andrew Ash wrote:
Hi Alex,
SparkContext.submitJob() is marked as experimental -- most client programs
shouldn't be using it. What are you looking to do?
For multiplexing jobs, one thing you can do is have multiple threads in
your client JVM each submit jobs through your shared SparkContext. This is
described here in the