Hi,

I'm searching for some information about running programs concurrently on Spark.

I did a simple experiment on the Spark Master: I opened two terminals, 
submitted two programs to the Master at the same time, and watched their 
status via the Spark Master Web UI. I found that one program started running 
while the other waited for the first one to complete. But my aim is to run 
the programs concurrently, not in FIFO order.

I thought Mesos mode could let the programs run concurrently, so I tried 
setting Spark to run in Mesos mode with "conf.set("spark.mesos.coarse", 
"true")", but the programs still ran one at a time. Am I wrong?
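For reference, this is roughly how I set things up (the mesos:// master URL 
here is just a placeholder for my cluster's address):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("MyApp")
  .setMaster("mesos://host:5050")        // placeholder Mesos master URL
  .set("spark.mesos.coarse", "true")     // coarse-grained Mesos mode
val sc = new SparkContext(conf)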

I also read this thread
http://apache-spark-user-list.1001560.n3.nabble.com/launching-concurrent-jobs-programmatically-td4990.html
where Andrew Ash mentioned that I can "submit multiple jobs through the same 
SparkContext via different threads". So now I'm wondering: how do I submit 
multiple jobs via different threads?
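Is something like this what he meant? Here is a minimal sketch of what I have 
in mind (the RDD operations are just placeholders); each thread calls an 
action on the shared SparkContext, which I understand is thread-safe for job 
submission:

import org.apache.spark.{SparkConf, SparkContext}

object ConcurrentJobs {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("ConcurrentJobs")
    val sc = new SparkContext(conf)

    // Each thread triggers its own action, so the two jobs are
    // submitted to the scheduler at the same time.
    val t1 = new Thread(new Runnable {
      def run(): Unit = {
        val sum = sc.parallelize(1 to 1000000).sum()
        println(s"Job 1 result: $sum")
      }
    })
    val t2 = new Thread(new Runnable {
      def run(): Unit = {
        val count = sc.parallelize(1 to 1000000).filter(_ % 2 == 0).count()
        println(s"Job 2 result: $count")
      }
    })

    t1.start(); t2.start()
    t1.join(); t2.join()

    sc.stop()
  }
}

Would this actually let the two jobs run concurrently, assuming the cluster 
has enough resources for both?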

Thanks,
Haoming