I think I found the problem.
I had to change the YARN capacity scheduler to use the
DominantResourceCalculator.
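For anyone who hits the same thing, the switch lives in capacity-scheduler.xml; a minimal sketch, per the Hadoop 2.7 CapacityScheduler docs:

  <property>
    <name>yarn.scheduler.capacity.resource-calculator</name>
    <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
  </property>

The default DefaultResourceCalculator only considers memory when sizing containers, which is why the vcore requests were being ignored.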
Thanks!
On Fri, Sep 25, 2015 at 4:54 AM, Akhil Das wrote:
Which version of Spark are you running? Can you also check what's set in your
conf/spark-defaults.conf file?
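A typical conf/spark-defaults.conf looks something like the sketch below (the values here are hypothetical). Note that flags passed on the spark-submit command line take precedence over entries in this file:

  spark.master            yarn
  spark.executor.cores    3
  spark.executor.memory   20g
  spark.driver.memory     20g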
Thanks
Best Regards
On Fri, Sep 25, 2015 at 1:58 AM, Gavin Yue wrote:
Running a Spark app on YARN 2.7.
Here are my spark-submit settings:
--master yarn-cluster \
--num-executors 100 \
--executor-cores 3 \
--executor-memory 20g \
--driver-memory 20g \
--driver-cores 2 \
But the executor cores setting is not working. It always assigns only one
vcore per container.
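One way to confirm the actual allocation is the ResourceManager REST API (a sketch, assuming the RM web UI is on its default port 8088; <rm-host> is a placeholder):

  curl http://<rm-host>:8088/ws/v1/cluster/apps?states=RUNNING

Each application entry includes allocatedVCores and runningContainers fields, so you can see the vcores-per-container ratio directly.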