From my experience with Spark 1.3.1, you can also set
spark.yarn.executor.memoryOverhead to about 7-10% of spark.executor.memory.
The sum of the two is what gets requested for the YARN container.
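As a rough sketch of the sizing rule described above: the container YARN is asked for is the executor heap plus the overhead. The 10% factor and 384 MB floor below are assumptions based on the commonly cited default (the exact factor has varied across Spark releases, so check the docs for your version):

```python
# Hedged sketch: approximate YARN container sizing for a Spark executor.
# Assumption: overhead defaults to ~10% of spark.executor.memory with a
# 384 MB floor when spark.yarn.executor.memoryOverhead is not set.

OVERHEAD_MIN_MB = 384
OVERHEAD_FACTOR = 0.10  # assumption: the 7-10% rule of thumb from the reply above

def yarn_container_request_mb(executor_memory_mb, overhead_mb=None):
    """Return the approximate memory YARN is asked for per executor."""
    if overhead_mb is None:
        # Fall back to the assumed default overhead formula.
        overhead_mb = max(OVERHEAD_MIN_MB, int(executor_memory_mb * OVERHEAD_FACTOR))
    return executor_memory_mb + overhead_mb

# Example: an 8 GB executor with the default overhead.
print(yarn_container_request_mb(8192))  # 8192 + 819 = 9011 MB requested
```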
On Tue, Jan 26, 2016 at 4:20 AM, Xiaoyu Ma wrote:
Hi all,
I saw Spark 1.6 has new off-heap settings: spark.memory.offHeap.size.
The docs say we need to shrink the on-heap size accordingly. But on YARN,
the on-heap size and the YARN container limit are set together via
spark.executor.memory (passing JVM memory options directly is not allowed
according to the docs), so how can we set the executor JVM heap size
accordingly?
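For reference, a minimal sketch of how the settings in question might be combined on YARN. The values are illustrative only, and the idea that off-heap usage must fit inside the memoryOverhead allowance is an assumption based on how YARN enforces the container limit, not something confirmed in this thread:

```properties
# Illustrative spark-defaults.conf fragment (example values, not a recommendation)
spark.executor.memory                 6g
# Spark 1.6 off-heap settings; spark.memory.offHeap.size is in bytes
spark.memory.offHeap.enabled          true
spark.memory.offHeap.size             2147483648
# Assumption: on YARN the off-heap region counts against the container
# limit, so the overhead (in MB) is raised to leave room for it
spark.yarn.executor.memoryOverhead    2560
```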