Hi All,

My cluster has 5 workers, each with 4 cores (so 20 cores in total). It is
running in standalone mode (not using Mesos or YARN). I want two programs to
run at the same time, so I have configured "spark.cores.max=3". However, when
I run the program it allocates the three cores by taking one core from each of
three different workers, so three workers end up running the program.
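
For reference, this is roughly how I am setting the property in the driver (a
minimal sketch; the app name and master URL below are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("first-program")             // placeholder app name
      .setMaster("spark://master-host:7077")   // standalone master URL (placeholder)
      .set("spark.cores.max", "3")             // cap this application at 3 cores cluster-wide
    val sc = new SparkContext(conf)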

How can I configure it so that it takes the 3 cores from a single worker,
leaving the other workers free for the second program?

Thanks in advance
Subacini
