Thanks Sean, let me try setting spark.deploy.spreadOut to false.
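If I am reading the standalone docs right, that is a master-side setting, so I
would add something like the following to conf/spark-env.sh on the master and
then restart it (untested sketch; the path assumes the default layout):

    # consolidate an app's cores onto as few workers as possible
    # instead of spreading one core per worker
    SPARK_MASTER_OPTS="-Dspark.deploy.spreadOut=false"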
On Sun, Jun 8, 2014 at 12:44 PM, Sean Owen wrote:
Have a look at:
https://spark.apache.org/docs/1.0.0/job-scheduling.html
https://spark.apache.org/docs/1.0.0/spark-standalone.html
The default is to grab resources on all nodes. In your case you could set
spark.cores.max to 2 or less to enable running two apps on a cluster of
4-core machines simultaneously.
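For example, something like this when building the SparkContext (a rough
sketch; the app name and master URL below are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // cap the total number of cores this application may claim
    // across the whole standalone cluster
    val conf = new SparkConf()
      .setAppName("MyApp")                    // placeholder
      .setMaster("spark://master-host:7077")  // placeholder
      .set("spark.cores.max", "2")
    val sc = new SparkContext(conf)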
Hi,
I am stuck here; my cluster is not efficiently utilized. I would appreciate any
input on this.
Thanks
Subacini
On Sat, Jun 7, 2014 at 10:54 PM, Subacini B wrote:
Hi All,
My cluster has 5 workers, each having 4 cores (so 20 cores in total). It is in
standalone mode (not using Mesos or YARN). I want two programs to run at the
same time, so I have configured "spark.cores.max=3", but when I run the
program it allocates three cores by taking one core from each worker, making ...