Running Spark app over YARN 2.7

Here are my spark-submit settings:
--master yarn-cluster \
 --num-executors 100 \
 --executor-cores 3 \
 --executor-memory 20g \
 --driver-memory 20g \
 --driver-cores 2 \
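
For reference, the full invocation looks roughly like this (com.example.MyApp and my-app.jar are just placeholders, not the real application):

    # com.example.MyApp and my-app.jar below are placeholders
    spark-submit \
      --master yarn-cluster \
      --num-executors 100 \
      --executor-cores 3 \
      --executor-memory 20g \
      --driver-memory 20g \
      --driver-cores 2 \
      --class com.example.MyApp \
      my-app.jar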

But the executor-cores setting is not taking effect: according to the cluster
metrics on the YARN ResourceManager web UI, every container is assigned only
one vCore.

The YARN container settings are:
min: <memory:6600, vCores:4>  max: <memory:106473, vCores:15>
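
I assume these limits come from the usual scheduler properties in yarn-site.xml, something like (values copied from the RM UI above):

    yarn.scheduler.minimum-allocation-mb     = 6600
    yarn.scheduler.minimum-allocation-vcores = 4
    yarn.scheduler.maximum-allocation-mb     = 106473
    yarn.scheduler.maximum-allocation-vcores = 15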

I have tried changing num-executors and executor-memory. YARN even ignores the
minimum vCores setting and always assigns one core per container.

Any advice?

Thank you!
