Re: Set the number/memory of workers under mesos

2014-06-21 Thread Mayur Rustagi
You can do that afterwards as well; it changes application-wide settings for subsequent tasks. On 20 Jun 2014 17:05, "Shuo Xiang" wrote: > Hi Mayur, > Are you referring to overriding the default sc in spark-shell? Is there > any way to do that before running the shell?
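[For reference: under Spark 1.x, per-application settings like these can also be passed when launching the shell itself, since spark-shell forwards options to spark-submit. A sketch only; the Mesos master URL and the values are placeholders, not from this thread:

```shell
# Hypothetical invocation (Spark 1.x); host:5050 and the values are placeholders
./bin/spark-shell \
  --master mesos://host:5050 \
  --conf spark.cores.max=8 \
  --conf spark.executor.memory=4g
```

Properties set this way apply application-wide, consistent with the behavior described above.]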

Re: Set the number/memory of workers under mesos

2014-06-20 Thread Shuo Xiang
Hi Mayur, Are you referring to overriding the default sc in spark-shell? Is there any way to do that before running the shell? On Fri, Jun 20, 2014 at 1:40 PM, Mayur Rustagi wrote: > You should be able to configure it in the Spark context in the Spark shell: > spark.cores.max and the executor memory. > Regards > Mayur

Re: Set the number/memory of workers under mesos

2014-06-20 Thread Mayur Rustagi
You should be able to configure it in the Spark context in the Spark shell: spark.cores.max and the executor memory. Regards Mayur Rustagi Ph: +1 (760) 203 3257 http://www.sigmoidanalytics.com @mayur_rustagi On Fri, Jun 20, 2014 at 4:30 PM, Shuo Xiang wrote: > Hi, just wondering if anybody knows how to set the number of workers (and > the amount of memory) under Mesos while launching spark-shell?

Set the number/memory of workers under mesos

2014-06-20 Thread Shuo Xiang
Hi, just wondering if anybody knows how to set the number of workers (and the amount of memory) under Mesos while launching spark-shell? I was trying to edit conf/spark-env.sh, but it looks like those environment variables are for YARN or standalone. Thanks!
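[For readers of the archive: the settings the thread converges on can also live in conf/spark-defaults.conf rather than spark-env.sh, since under Mesos they are per-application Spark properties, not deployment environment variables. A sketch; the master URL and values are placeholders:

```properties
# conf/spark-defaults.conf (sketch; property names per the Spark 1.x docs)
spark.master            mesos://host:5050
spark.cores.max         8
spark.executor.memory   4g
```
]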