Hi,

I am stuck here; my cluster is not being utilized efficiently. I would
appreciate any input on this.

Thanks
Subacini


On Sat, Jun 7, 2014 at 10:54 PM, Subacini B <subac...@gmail.com> wrote:

> Hi All,
>
> My cluster has 5 workers, each with 4 cores (so 20 cores in total). It is in
> standalone mode (not using Mesos or YARN). I want two programs to run at the
> same time, so I have configured "spark.cores.max=3". But when I run the
> program, it allocates three cores by taking one core from each of three
> workers, so three workers end up running the program.
>
> How can I configure it so that it takes 3 cores from a single worker, leaving
> the other workers free for the second program?
>
> Thanks in advance
> Subacini
>

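For reference, here is a minimal sketch (in Scala) of the setup described in the
quoted message: an application submitted to the standalone master with
spark.cores.max set to 3. The application name and master URL are placeholder
values, not taken from the original message.

import org.apache.spark.{SparkConf, SparkContext}

object CoreCapExample {
  def main(args: Array[String]): Unit = {
    // Cap this application at 3 cores cluster-wide. "CoreCapExample" and the
    // master URL below are placeholders for illustration only.
    val conf = new SparkConf()
      .setAppName("CoreCapExample")
      .setMaster("spark://master-host:7077")
      .set("spark.cores.max", "3")
    val sc = new SparkContext(conf)

    // ... job logic would go here ...

    sc.stop()
  }
}

As I understand it, with this configuration the standalone master's
spark.deploy.spreadOut setting (true by default) decides whether those 3 cores
are spread across workers or consolidated onto as few workers as possible, but
I may be missing something here.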