Hello Michel,

Spark separates an executor's memory using an adaptive boundary between storage and execution memory. If there is no caching and execution needs more space, it will borrow a portion of the storage region.
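As a rough illustration, assuming the defaults (spark.memory.fraction=0.6,
spark.memory.storageFraction=0.5, 300 MB reserved) and a hypothetical 4 GB
executor heap:

    unified region = (4096 MB - 300 MB reserved) * 0.6 = ~2278 MB
    storage region = 2278 MB * 0.5                     = ~1139 MB

The ~1139 MB is only the part of the unified region whose cached blocks are
immune to eviction; while nothing is cached, execution is free to use all of it.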

If your program does not use caching, you can reduce spark.memory.storageFraction.
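For example, a minimal sketch in Scala (the 0.2 value and the app name are
purely illustrative; tune the fraction for your job):

    import org.apache.spark.sql.SparkSession

    // Shrink the eviction-immune storage region since nothing is cached,
    // leaving a larger guaranteed share of the unified region to execution.
    val spark = SparkSession.builder()
      .appName("no-cache-job")
      .config("spark.memory.storageFraction", "0.2")
      .getOrCreate()

The same setting can also be passed at submit time with
--conf spark.memory.storageFraction=0.2.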

Iacovos

On 20/3/20 4:40 PM, msumbul wrote:
Hello,

I'm wondering about the exact meaning of the spark.memory.storageFraction
setting.
The documentation mentions:

"Amount of storage memory immune to eviction, expressed as a fraction of the
size of the region set aside by spark.memory.fraction. The higher this is,
the less working memory may be available to execution and tasks may spill to
disk more often"

Does that mean that if there is no caching, that part of the memory will not
be used at all?
In the Spark UI, in the "Executors" tab, I can see that the "storage memory"
is always zero. Does that mean that that part of the memory is never used at
all, so I can reduce it, or just that it is never used for storage specifically?

Thanks in advance for your help,
Michel




---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org

