Well, that's actually what I need (one simple app, several contexts, similar to
what JobServer does), and I'm just looking for a workaround here.
Classloaders look a little easier to me than spawning my own processes.
To be more specific, I just need to be able to execute arbitrary Spark jobs
from a long-lived web application with no prior knowledge of those jobs, so I
need to accept jars containing those jobs (again, like JobServer).
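
For illustration, the kind of classloader workaround I have in mind is roughly
this (jar paths and the job class name are made up, and I haven't verified it
against Spark's internals):

  import java.net.{URL, URLClassLoader}

  // Hypothetical paths; in reality this would be the uploaded job jar plus
  // whatever Spark/Scala jars the job needs.
  val jars: Array[URL] = Array(
    new URL("file:/opt/spark/lib/spark-assembly.jar"),
    new URL("file:/opt/scala/lib/scala-library.jar"),
    new URL("file:/tmp/uploads/user-job.jar")
  )

  // parent = null, so Spark classes from the web app's own classloader are not
  // visible; each such loader gets its own copy of Spark's static state.
  val isolated = new URLClassLoader(jars, null)

  val previous = Thread.currentThread().getContextClassLoader
  Thread.currentThread().setContextClassLoader(isolated)
  try {
    // Entry point resolved reflectively, so the web app never links against Spark.
    val jobClass = isolated.loadClass("com.example.UserJob")
    jobClass.getMethod("run").invoke(jobClass.newInstance())
  } finally {
    Thread.currentThread().setContextClassLoader(previous)
  }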

As far as I understand, I can't load new jars into a SparkContext that has
already spawned executors on the cluster; I need to create a new context to
pick up new jars. Am I right about this?
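
(What I'm referring to is SparkContext.addJar; as far as I can tell it only
ships the jar to executors for tasks submitted afterwards and doesn't load the
classes on the already-running driver side. The path below is just an example.)

  import org.apache.spark.{SparkConf, SparkContext}

  // Long-lived context created once by the web app (config values are examples).
  val sc = new SparkContext(
    new SparkConf().setAppName("job-host").setMaster("spark://master:7077"))

  // Distributes the uploaded jar to executors for future tasks,
  // but, to my understanding, does not touch the driver's classpath.
  sc.addJar("/tmp/uploads/user-job.jar")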


-----Original Message-----
From: Sean Owen [mailto:so...@cloudera.com] 
Sent: Thursday, December 18, 2014 2:04 AM
To: Anton Brazhnyk
Cc: user@spark.apache.org
Subject: Re: SPARK-2243 Support multiple SparkContexts in the same JVM

Yes, although once you have multiple ClassLoaders, you are operating as if in 
multiple JVMs for most intents and purposes. I think the request for this kind 
of functionality comes from use cases where multiple ClassLoaders wouldn't 
work, like wanting to have one app (in one ClassLoader) managing multiple 
contexts.

On Thu, Dec 18, 2014 at 2:23 AM, Anton Brazhnyk <anton.brazh...@genesys.com> 
wrote:
> Greetings,
>
>
>
> The first comment on the issue says that the reason for not supporting
> multiple contexts is “There are numerous assumptions in the code base
> that uses a shared cache or thread local variables or some global
> identifiers which prevent us from using multiple SparkContext's.”
>
>
>
> Could it be worked around by creating those contexts in several
> classloaders, each with its own copy of the Spark classes?
>
>
>
> Thanks,
>
> Anton
