I'm launching a Spark shell with the following parameters:

./spark-shell --master yarn-client --executor-memory 32g --driver-memory 4g \
  --executor-cores 32 --num-executors 8
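
For what it's worth, the settings can be double-checked from inside the shell (a quick sketch using the standard SparkConf keys; exact key names may vary by Spark version):

scala> sc.getConf.get("spark.executor.memory")     // expect "32g"
scala> sc.getConf.get("spark.executor.instances")  // expect "8"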

but when I look at the Spark UI, it shows only 209.3 GB of total memory:

Executors (10)

   - Memory: 55.9 GB Used (209.3 GB Total)

This is a 10-node YARN cluster where each node has 48 GB of memory.
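
Back-of-the-envelope, here is what I expected (ignoring any per-executor or YARN overhead):

   8 executors x 32 GB requested = 256 GB
  10 nodes x 48 GB per node      = 480 GB in the cluster

so the 209.3 GB total in the UI doesn't match either number.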

Any idea what I'm missing here?

Thanks
-Soumya
