Try with *spark.cores.max* instead; --executor-cores is normally used when
you run on YARN. If I remember correctly, in standalone mode the master
spreads cores across workers one at a time, so with --executor-cores 4 a
worker can end up with fewer than 4 assigned cores and then launches no
executor there at all, which would explain the reduced counts and zeros in
your table.
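
For example (a sketch; if I remember correctly, spark.cores.max is the same
property that --total-executor-cores sets):

    /root/spark/bin/spark-shell --conf spark.cores.max=12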

Thanks
Best Regards

On Mon, Jul 6, 2015 at 1:22 AM, nizang <[email protected]> wrote:

> hi,
>
> We're running Spark 1.4.0 on EC2, with 6 machines of 4 cores each. We're
> trying to run an application with a given total number of executor cores,
> but we want it to run on as few machines as possible (e.g. for
> total-executor-cores=4 we'd want a single machine; for
> total-executor-cores=12, 3 machines).
>
> I'm running the Spark shell with one of the following commands:
>
> /root/spark/bin/spark-shell --total-executor-cores X --executor-cores 4
>
> or
>
> /root/spark/bin/spark-shell --total-executor-cores X
>
> and checked the allocated cores in the Spark UI, with the following results:
>
> Req total-executor-cores   Actual cores without     Actual cores with
>                            --executor-cores         --executor-cores=4
> 24                         24                       24
> 22                         22                       16
> 20                         20                       8
> 16                         16                       0
> 12                         12                       0
> 8                          8                        0
> 4                          4                        0
>
> our questions:
>
> 1) Why don't we always get the number of cores we asked for when passing
> the "--executor-cores 4" parameter? It seems that the number of cores we
> actually get is something like max(24 - (24 - REQ_TOTAL_CORES) * 4, 0).
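>
> (To check: assuming that formula, for total-executor-cores=20 it gives
> max(24 - (24 - 20) * 4, 0) = max(8, 0) = 8, matching the table above.)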
>
> 2) How can we get both parts of our original request, i.e. the full number
> of cores on a minimal number of machines? When playing with --executor-cores
> we hit the problem described in (1), but the cores we do get are at least on
> a minimal number of machines.
>
> 3) Playing with the spark.deploy.spreadOut parameter didn't seem to help
> with this either.
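>
> For reference, here's a sketch of how we set it (assuming the usual
> spark-env.sh mechanism on the master, with a master restart afterwards):
>
>   # conf/spark-env.sh on the master node
>   SPARK_MASTER_OPTS="-Dspark.deploy.spreadOut=false"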
>
> thanks, nizan
