Both are part of the heap.
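
For a concrete picture, here is a minimal sketch (not Spark's actual code) of how the container request is derived from the two settings. The 4g heap figure and the max(384 MB, 10%) overhead rule are illustrative assumptions only:

// A minimal sketch, not Spark's implementation: how the YARN container
// request follows from spark.executor.memory plus the overhead setting.
object ExecutorContainerSize {
  def main(args: Array[String]): Unit = {
    val executorMemoryMb = 4096                                      // spark.executor.memory=4g, becomes -Xmx4096m
    val overheadMb = math.max(384, (executorMemoryMb * 0.10).toInt)  // assumed rule if memoryOverhead is unset
    val containerMb = executorMemoryMb + overheadMb                  // total requested from YARN per executor
    println(s"YARN container request: $containerMb MB " +
      s"($executorMemoryMb MB heap + $overheadMb MB overhead)")
  }
}

The overhead slice is what is meant to absorb everything outside -Xmx (permgen/metaspace, thread stacks, direct buffers, and so on), which is why the container request is not broken out per JVM region.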

> On 16 Aug 2016, at 04:26, Lan Jiang <ljia...@gmail.com> wrote:
> 
> Hello,
> 
> My understanding is that the YARN executor container memory is based on 
> "spark.executor.memory" + "spark.yarn.executor.memoryOverhead". The first 
> is for heap memory and the second is for off-heap memory. The value of 
> spark.executor.memory is passed to -Xmx to set the max heap size. Now my 
> question is: why does it not count the permgen size and the memory used by 
> thread stacks? They are not part of the max heap size. IMHO, the YARN 
> executor container memory should be set to: spark.executor.memory + 
> [-XX:MaxPermSize] + number_of_threads * [-Xss] + 
> spark.yarn.executor.memoryOverhead. What did I miss?
> 
> Lan

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
