Spill-overs are a common issue for in-memory computing systems; after all,
memory is limited. In Spark, where RDDs are immutable, if an RDD is created
with a size greater than half the node's RAM, then a transformation that
generates the consequent RDD' can potentially fill all of the node's memory,
which can cause a spill-over.
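
As an illustration only (not from this thread): in spark-shell, where sc is
the provided SparkContext, one way to keep a derived RDD from exhausting RAM
is to persist it with a storage level that allows spilling to disk. The input
path below is hypothetical.

import org.apache.spark.storage.StorageLevel

// Hypothetical input path; assume the file is large relative to the node's RAM.
val big = sc.textFile("hdfs:///data/big-input.txt")

// The derived RDD' from the transformation. Persisting it as MEMORY_AND_DISK
// lets partitions that do not fit in memory spill to local disk instead of
// failing with an OutOfMemoryError when the RDD is materialized.
val derived = big.map(_.toUpperCase)
derived.persist(StorageLevel.MEMORY_AND_DISK)

println(derived.count())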
Hi,
Which version of Spark do you use?
The recent one cannot handle this kind of spilling; see:
http://spark.apache.org/docs/latest/tuning.html#memory-management-overview.
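
For reference, a minimal sketch of the two unified-memory settings described
on that page, spark.memory.fraction and spark.memory.storageFraction (the
values below are only illustrative, not recommendations):

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("memory-tuning-sketch")
  // Fraction of (JVM heap - 300MB) shared by execution and storage.
  .set("spark.memory.fraction", "0.6")
  // Share of that region protected for cached (storage) blocks.
  .set("spark.memory.storageFraction", "0.5")
val sc = new SparkContext(conf)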
// maropu
On Fri, May 13, 2016 at 8:07 AM, Ashok Kumar wrote:
> Hi,
>
> How can one avoid having Spark spill over after filling the node's memory?
Hi,
How can one avoid having Spark spill over after filling the node's memory?
Thanks