Github user hero0926 commented on the issue:

    https://github.com/apache/zeppelin/pull/2011
  
    We're hitting the same problem as karuppayya: when we try to unpersist in
pyspark+zeppelin, the memory is not released, and if we fall back to sc.stop,
the Zeppelin interpreter dies... I'm worried the OOM error can't be avoided,
but is there any way to use pyspark without running into this problem?
    
    @karuppayya, how about uncaching and creating another sc, or setting sc
to null a few times? I think recreating sc in code is an alternative way to
escape this problem.

