Re: Spark Worker Core Allocation

2014-06-08 Thread Subacini B
Thanks Sean, let me try setting spark.deploy.spreadOut to false.

On Sun, Jun 8, 2014 at 12:44 PM, Sean Owen wrote:
> Have a look at:
>
> https://spark.apache.org/docs/1.0.0/job-scheduling.html
> https://spark.apache.org/docs/1.0.0/spark-standalone.html
>
> The default is to grab resources on all nodes.
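For anyone landing on this thread later: spark.deploy.spreadOut is read by the standalone master rather than by individual applications, so it is typically set on the master machine, for example through SPARK_MASTER_OPTS. A minimal sketch (the file location follows the standard standalone-mode layout; nothing here is quoted from the thread):

```
# conf/spark-env.sh on the standalone master (sketch)
# Consolidate each application onto as few workers as possible
# instead of spreading its cores across the cluster.
export SPARK_MASTER_OPTS="-Dspark.deploy.spreadOut=false"
```

The master needs to be restarted for the change to take effect, and the setting applies to every application scheduled on that cluster, not just one job.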

Re: Spark Worker Core Allocation

2014-06-08 Thread Sean Owen
Have a look at:

https://spark.apache.org/docs/1.0.0/job-scheduling.html
https://spark.apache.org/docs/1.0.0/spark-standalone.html

The default is to grab resources on all nodes. In your case you could set spark.cores.max to 2 or less to enable running two apps on a cluster of 4-core machines simultaneously.
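A minimal sketch of what that per-application cap might look like in Scala; the app name and master URL below are placeholders, not values taken from this thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Cap this application at 2 cores so a second application can still
// acquire cores on the same standalone cluster.
val conf = new SparkConf()
  .setAppName("app-one")                  // placeholder app name
  .setMaster("spark://master-host:7077")  // placeholder master URL
  .set("spark.cores.max", "2")

val sc = new SparkContext(conf)
```

The same property can also be given as a cluster-wide default in conf/spark-defaults.conf (`spark.cores.max 2`), so that applications which do not set it explicitly still leave cores free for others.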

Re: Spark Worker Core Allocation

2014-06-08 Thread Subacini B
Hi, I am stuck here; my cluster is not efficiently utilized. I'd appreciate any input on this.

Thanks,
Subacini

On Sat, Jun 7, 2014 at 10:54 PM, Subacini B wrote:
> Hi All,
>
> My cluster has 5 workers, each having 4 cores (so 20 cores in total). It is in
> standalone mode (not using Mesos or Yarn)

Spark Worker Core Allocation

2014-06-07 Thread Subacini B
Hi All,

My cluster has 5 workers, each having 4 cores (so 20 cores in total). It is in standalone mode (not using Mesos or Yarn). I want two programs to run at the same time, so I have configured "spark.cores.max=3", but when I run the program it allocates three cores taking one core from each worker mak
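What the poster is describing is the standalone master's default "spread out" behavior: with spark.deploy.spreadOut=true, the master hands an application its cores one at a time, round-robin, across workers that have free cores. The following is only an illustrative sketch of that allocation pattern, with hypothetical names; it is not Spark's actual scheduler code and ignores memory and availability checks:

```scala
// Toy model of the standalone master's default "spread out" allocation
// (spark.deploy.spreadOut=true): cores are assigned one at a time,
// round-robin, across workers with free cores.
object SpreadOutSketch {
  def main(args: Array[String]): Unit = {
    val workers  = Seq("worker1", "worker2", "worker3", "worker4", "worker5")
    val coresMax = 3 // the application's spark.cores.max

    // One core per pass over the workers until coresMax cores are assigned.
    val assigned = (0 until coresMax).map(i => workers(i % workers.size))

    // Each of worker1, worker2 and worker3 contributes exactly one core,
    // which is the fragmentation described in the original question.
    println(assigned.groupBy(identity).map { case (w, cores) => w -> cores.size })
  }
}
```

Setting spark.deploy.spreadOut=false flips this behavior so the master consolidates an application's cores onto as few workers as possible.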