Hi Mayur,
  Are you referring to overriding the default sc in spark-shell? Is there
any way to do that before running the shell?
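For example, would setting these in conf/spark-defaults.conf before launching
work? (Just a guess on my part; the master URL and values below are
placeholders.)

  spark.master           mesos://host:5050
  spark.cores.max        8
  spark.executor.memory  4g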


On Fri, Jun 20, 2014 at 1:40 PM, Mayur Rustagi <mayur.rust...@gmail.com>
wrote:

> You should be able to configure this on the SparkContext in the Spark
> shell, via spark.cores.max and spark.executor.memory.
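>
> A minimal sketch from inside spark-shell (the Mesos master URL, app name,
> and values are placeholders, not something I've tested on your cluster):
>
>   import org.apache.spark.{SparkConf, SparkContext}
>
>   sc.stop()  // stop the shell's default context before replacing it
>   val conf = new SparkConf()
>     .setAppName("shell")                 // hypothetical app name
>     .setMaster("mesos://host:5050")      // placeholder Mesos master URL
>     .set("spark.cores.max", "8")         // cap on total cores claimed
>     .set("spark.executor.memory", "4g")  // memory per executor
>   val sc = new SparkContext(conf)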
> Regards
> Mayur
>
> Mayur Rustagi
> Ph: +1 (760) 203 3257
> http://www.sigmoidanalytics.com
> @mayur_rustagi <https://twitter.com/mayur_rustagi>
>
>
>
> On Fri, Jun 20, 2014 at 4:30 PM, Shuo Xiang <shuoxiang...@gmail.com>
> wrote:
>
>> Hi, just wondering if anybody knows how to set the number of workers (and
>> the amount of memory) on Mesos while launching spark-shell? I was trying to
>> edit conf/spark-env.sh, but it looks like those environment variables are
>> for YARN or standalone mode. Thanks!
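>>
>> For instance, variables like these seem to apply only to the standalone
>> worker daemons, not to Mesos (my reading of the docs, so I may be wrong):
>>
>>   export SPARK_WORKER_CORES=8
>>   export SPARK_WORKER_MEMORY=4g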
