I am not entirely sure, but:
- if the RDD is persisted in memory, then on task failure the executor JVM process fails too, so the memory is released with it
- if the RDD is persisted on disk, then on task failure Spark's shutdown hook simply wipes the temp files
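In any case, rather than relying on failure-time cleanup you can release the cache explicitly. A minimal Scala sketch (the `sc` SparkContext and the sample data are assumptions, e.g. as provided by spark-shell):

```scala
import org.apache.spark.storage.StorageLevel

// Assumes an existing SparkContext `sc` (e.g. from spark-shell).
val rdd = sc.parallelize(1 to 1000).map(_ * 2)

// Persist: MEMORY_ONLY blocks live in the executor JVM heap,
// DISK_ONLY blocks go to temp files under spark.local.dir.
rdd.persist(StorageLevel.MEMORY_AND_DISK)
rdd.count() // materializes the cache

// Explicitly drop the cached blocks instead of waiting for
// executor exit or the shutdown hook to clean them up.
rdd.unpersist(blocking = true)
```

With `blocking = true` the call waits until all blocks are actually removed, which is useful if you want to reclaim the space before the next stage runs.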
On Thu, Mar 23, 2017 at 10:55 AM, Jörn Franke <jornfra...@gmail.com> wrote:
> What do you mean by clear ? What is the use case?
>
> On 23 Mar 2017, at 10:16, nayan sharma <nayansharm...@gmail.com> wrote:
>
> Does Spark clear the persisted RDD in case the task fails ?
>
> Regards,
> Nayan