Re: Possible memory leak after closing spark context in v2.0.1

2016-10-17 Thread Lev Katzav
>> …park-user-list.1001560.n3.nabble.com/file/n27910/profiler.png
>>
>> Since each test suite passes when running by itself,
>> I think that the broadcasts are leaking between the test suites.
>>
>> Any suggestions on how to resolve this? …

Re: Possible memory leak after closing spark context in v2.0.1

2016-10-17 Thread Sean Owen
> …n running by itself, I think that the broadcasts are leaking between the test suites.
> Any suggestions on how to resolve this?
> thanks
>
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Possible-memory…
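One common remedy for broadcasts leaking across test suites that share a SparkContext is to destroy every broadcast a suite created in its class-level teardown. The sketch below illustrates that pattern without importing Spark: `FakeBroadcast` is a hypothetical stand-in, assuming only that real `pyspark.Broadcast` objects expose `destroy()` (which they do in the PySpark API); everything else is standard `unittest`.

```python
import unittest


class FakeBroadcast:
    """Hypothetical stand-in for pyspark.Broadcast (real broadcasts also expose destroy())."""

    def __init__(self, value):
        self.value = value
        self.destroyed = False

    def destroy(self):
        # Release the broadcast value so it cannot be retained
        # by the next suite running in the same JVM/context.
        self.value = None
        self.destroyed = True


class SuiteWithBroadcasts(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Registry of every broadcast this suite creates.
        cls.broadcasts = []

    @classmethod
    def tearDownClass(cls):
        # Destroy all broadcasts created by this suite so their
        # state is reclaimed before the next suite starts.
        for b in cls.broadcasts:
            b.destroy()
        cls.broadcasts.clear()

    def test_lookup(self):
        b = FakeBroadcast({"a": 1})
        self.broadcasts.append(b)
        self.assertEqual(b.value["a"], 1)
```

With real Spark, the same registry-and-teardown shape applies; suites would additionally call `sc.stop()` once all of them are done with the shared context.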

Possible memory leak after closing spark context in v2.0.1

2016-10-17 Thread lev
…ist.1001560.n3.nabble.com/Possible-memory-leak-after-closing-spark-context-in-v2-0-1-tp27910.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
To unsubscribe e-mail: user-unsubscr...@spark.apache.org

Possible memory leak

2016-03-14 Thread adamreith
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Possible-memory-leak-tp26478.html

possible memory leak when re-creating SparkContext's in the same JVM

2015-01-19 Thread Noam Barcay
The problem we're seeing is a gradual memory leak in the driver's JVM. We're executing jobs from a long-running Java app that creates relatively short-lived SparkContexts, so our Spark drivers are created as part of this application's JVM. We're using standalone cluster mode, Spark 1.0.2. Root cause of t…
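When a long-running driver app creates many short-lived contexts, a common safeguard is to make sure `stop()` is always called, even when a job fails, so driver-side state is released before the next context is created. A minimal sketch of that pattern, assuming a hypothetical `FakeSparkContext` stand-in (real `SparkContext` likewise exposes `stop()` and permits only one active context per JVM):

```python
from contextlib import contextmanager


class FakeSparkContext:
    """Hypothetical stand-in for SparkContext; the real one also exposes stop()."""

    _active = None  # mimics Spark's one-active-context-per-JVM rule

    def __init__(self, app_name):
        if FakeSparkContext._active is not None:
            raise RuntimeError("an active context already exists in this JVM")
        self.app_name = app_name
        FakeSparkContext._active = self

    def stop(self):
        # Release the slot so a new context can be created later.
        FakeSparkContext._active = None


@contextmanager
def spark_context(app_name):
    """Create a context and guarantee stop() runs, even if the job raises."""
    sc = FakeSparkContext(app_name)
    try:
        yield sc
    finally:
        sc.stop()
```

Usage for sequential short-lived jobs in one process:

```python
with spark_context("job-1") as sc:
    pass  # run the job against sc
with spark_context("job-2") as sc:
    pass  # previous context was stopped, so this succeeds
```

This does not by itself fix a leak inside Spark's own driver internals, but it rules out the easy failure mode where an exception skips `stop()` and each dead context keeps its state pinned in the JVM.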