Hello,

I'm asking myself about the exact meaning of the spark.memory.storageFraction setting. The documentation says:
"Amount of storage memory immune to eviction, expressed as a fraction of the size of the region set aside by spark.memory.fraction. The higher this is, the less working memory may be available to execution and tasks may spill to disk more often" Does that mean that if there is no caching that part of the memory will not be used at all? In the spark UI, in the tab "Executor", I can see that the "storage memory" is always zero. Does that mean that that part of the memory is never used at all and I can reduce it or never used for storage specifically? Thanks in advance for your help, Michel -- Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/ --------------------------------------------------------------------- To unsubscribe e-mail: user-unsubscr...@spark.apache.org