RE: Multiple Spark Context

2014-11-14 Thread Charles
Thanks for your reply! Can you be more specific about the JVM? Is the JVM referring to the driver application? If I want to create multiple SparkContexts, will I need to start a driver application instance for each SparkContext?
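For illustration, here is a minimal sketch of such a driver application, assuming the Spark 1.x Scala API current at the time (the object name, app name, and job logic are placeholders). Each JVM that runs a main() like this is one driver, and it owns exactly one SparkContext:

    import org.apache.spark.{SparkConf, SparkContext}

    // Each driver application runs in its own JVM and creates exactly one SparkContext.
    object SingleContextDriver {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("single-context-driver")
        val sc = new SparkContext(conf)
        try {
          // Placeholder job logic: sum the numbers 1..10 on the cluster.
          println(sc.parallelize(1 to 10).sum())
        } finally {
          sc.stop() // release the context before the JVM exits
        }
      }
    }

So a second SparkContext would mean starting a second application like this, i.e., a second driver JVM.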

RE: Multiple Spark Context

2014-11-14 Thread Bui, Tri
Does this also apply to StreamingContext? What issue would I have if I have 1000s of StreamingContexts? Thanks, Tri From: Daniil Osipov [mailto:daniil.osi...@shazam.com] Sent: Friday, November 14, 2014 3:47 PM To: Charles Cc: u...@spark.incubator.apache.org Subject: Re: Multiple Spark Context
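As an aside, the usual alternative to thousands of StreamingContexts is many input DStreams inside a single StreamingContext. A hedged sketch assuming the Spark 1.x Streaming Scala API (the hostnames and ports are placeholders):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object ManyStreamsOneContext {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("many-streams-one-context")
        val ssc = new StreamingContext(conf, Seconds(10))

        // Ten socket input streams hosted by one StreamingContext (placeholder ports).
        val streams = (9000 to 9009).map(port => ssc.socketTextStream("localhost", port))

        // Merge them into a single DStream and process them together.
        val merged = ssc.union(streams)
        merged.count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

Note that each socket receiver occupies an executor core while it runs, so the cluster still has to be sized for the number of receivers, not just the number of contexts.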

Re: Multiple Spark Context

2014-11-14 Thread Daniil Osipov
It's not recommended to have multiple Spark contexts in one JVM, but you could launch a separate JVM per context. How resources get allocated is probably outside the scope of Spark, and more of a task for the cluster manager. On Fri, Nov 14, 2014 at 12:58 PM, Charles wrote: > I need continuously
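A sketch of that separate-JVM-per-context approach: each spark-submit invocation below starts its own driver JVM, which in turn creates its own SparkContext (the master URL, class names, and jar path are placeholders):

    import scala.sys.process._

    // Launch one driver JVM per job; each driver owns a single SparkContext.
    object LaunchOneJvmPerContext {
      def main(args: Array[String]): Unit = {
        val jobs = Seq("com.example.JobA", "com.example.JobB")
        jobs.foreach { mainClass =>
          // Blocks until that driver JVM exits and returns its exit code.
          val exitCode = Seq(
            "spark-submit",
            "--master", "yarn",
            "--class", mainClass,
            "/path/to/app.jar"
          ).!
          println(s"$mainClass finished with exit code $exitCode")
        }
      }
    }

This sketch runs the jobs one after another; submitting them from separate shells (or in parallel) would run the drivers concurrently, and the cluster manager then arbitrates resources between the applications, which matches the point above that allocation is the cluster manager's job rather than Spark's.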