Re: Best practices for sharing a Spark cluster across a few applications

2016-02-14 Thread Alex Kozlov
Praveen, the mode in which you run Spark (standalone, YARN, Mesos) is determined when you create the SparkContext. You are right that spark-submit and spark-shell create different SparkContexts. In general, resour
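A minimal sketch of this point, assuming a Java application with the YARN client configuration on its classpath (the app name is hypothetical): the master URL handed to SparkConf at SparkContext creation is what fixes the deployment mode.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class ModeDemo {
        public static void main(String[] args) {
            // The master URL fixed at SparkContext creation selects the mode:
            // "local[*]" (in-process), "spark://host:7077" (standalone),
            // "mesos://host:5050" (Mesos), "yarn-client"/"yarn-cluster" (YARN, Spark 1.x)
            SparkConf conf = new SparkConf()
                    .setAppName("shared-cluster-demo")  // hypothetical app name
                    .setMaster("yarn-client");
            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... run jobs ...
            sc.stop();
        }
    }

spark-submit does essentially the same thing: it injects spark.master and the rest of the configuration before the context is created, which is why spark-submit jobs and spark-shell sessions end up with different SparkContexts.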

Re: Best practices for sharing a Spark cluster across a few applications

2016-02-14 Thread praveen S
Even I was trying to launch Spark jobs from a web service, but I thought you could run Spark jobs in YARN mode only through spark-submit. Is my understanding incorrect? Regards, Praveen On 15 Feb 2016 08:29, "Sabarish Sasidharan" wrote: > Yes, you can look at using the capacity scheduler or the
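For reference, spark-submit is not the only entry point: since Spark 1.4 the org.apache.spark.launcher.SparkLauncher API lets a web service start a YARN-mode application programmatically. A minimal sketch (the jar path and main class are hypothetical):

    import org.apache.spark.launcher.SparkLauncher;

    public class LaunchFromService {
        public static void main(String[] args) throws Exception {
            // Launch a Spark application on YARN from plain Java code
            Process job = new SparkLauncher()
                    .setAppResource("/opt/jobs/ml-job.jar")  // hypothetical jar
                    .setMainClass("com.example.MlJob")       // hypothetical class
                    .setMaster("yarn-client")
                    .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
                    .launch();                               // spawns the driver process
            int exit = job.waitFor();                        // block until the job finishes
            System.out.println("Spark job exited with code " + exit);
        }
    }

Under the hood SparkLauncher still drives the same spark-submit machinery, so the understanding above is not far off; it just removes the need to shell out to the script by hand.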

Re: Best practices for sharing a Spark cluster across a few applications

2016-02-14 Thread Sabarish Sasidharan
Yes, you can look at using the capacity scheduler or the fair scheduler with YARN. Both let an application use the full cluster when it is otherwise idle, and both can take CPU as well as memory into account when allocating resources, which is more or less a necessity with Spark. Regards Sab On 13-Feb-2016 10:11 pm, "Eugene Morozov" wrote: > Hi
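To illustrate the CPU-plus-memory point (a sketch; everything else in the file would be site-specific): with the capacity scheduler this means switching the resource calculator in capacity-scheduler.xml, since the default DefaultResourceCalculator looks at memory only.

    <!-- capacity-scheduler.xml: make YARN account for vcores as well as
         memory when allocating containers -->
    <property>
      <name>yarn.scheduler.capacity.resource-calculator</name>
      <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
    </property>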

Re: Best practices for sharing a Spark cluster across a few applications

2016-02-13 Thread Jörn Franke
This is possible with YARN. You also need to think about preemption for the case where one web service starts doing something and, after a while, another web service also wants to do something. > On 13 Feb 2016, at 17:40, Eugene Morozov wrote: > > Hi, > > I have several instances of the same web-service
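A rough sketch of that setup with the YARN fair scheduler (queue names, sizes, and timeouts are made up): preemption is enabled in yarn-site.xml, and each web service gets its own queue with a preemption timeout in fair-scheduler.xml.

    <!-- yarn-site.xml: let the fair scheduler preempt containers -->
    <property>
      <name>yarn.scheduler.fair.preemption</name>
      <value>true</value>
    </property>

    <!-- fair-scheduler.xml: one queue per web service; a queue held below
         its fair share for 60 seconds may preempt containers -->
    <allocations>
      <queue name="webservice-a">
        <minResources>8192 mb,8 vcores</minResources>
        <fairSharePreemptionTimeout>60</fairSharePreemptionTimeout>
      </queue>
      <queue name="webservice-b">
        <minResources>8192 mb,8 vcores</minResources>
        <fairSharePreemptionTimeout>60</fairSharePreemptionTimeout>
      </queue>
    </allocations>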

Best practices for sharing a Spark cluster across a few applications

2016-02-13 Thread Eugene Morozov
Hi, I have several instances of the same web service that run some ML algorithms on Spark (both training and prediction) and also do some Spark-unrelated work. Each web-service instance creates its own JavaSparkContext, so they are seen as separate applications by Spark and are thus configured with
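Since each instance owns its own JavaSparkContext and is therefore a separate application, one way to keep the instances from starving each other (a sketch, assuming YARN with the external shuffle service running; the queue name is hypothetical) is to pin each context to a queue and enable dynamic allocation so an idle instance gives its executors back:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SharedServiceContext {
        public static JavaSparkContext create() {
            SparkConf conf = new SparkConf()
                    .setAppName("ml-web-service")             // hypothetical name
                    .setMaster("yarn-client")
                    .set("spark.yarn.queue", "webservice-a")  // hypothetical queue
                    // Release executors while this instance is idle so the
                    // other web-service applications can use the capacity
                    .set("spark.dynamicAllocation.enabled", "true")
                    .set("spark.shuffle.service.enabled", "true")  // required on YARN
                    .set("spark.dynamicAllocation.minExecutors", "1")
                    .set("spark.dynamicAllocation.maxExecutors", "20");
            return new JavaSparkContext(conf);
        }
    }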