I am getting excessive memory leak warnings when running multiple map and aggregation operations over Datasets. Is there anything I should look for to resolve this, or is this a known issue?
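For context, the pipeline is along these lines — a minimal, hypothetical sketch (class, column, and app names are illustrative, not the actual job) of the kind of typed map-then-aggregate workload that produces the warnings:

```scala
import org.apache.spark.sql.SparkSession

object LeakWarningRepro {
  case class Record(key: String, value: Long)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("leak-warning-repro")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val result = spark.range(0, 1000000L)
      .map(i => Record(s"k${i % 100}", i))     // typed map -> Dataset[Record]
      .map(r => r.copy(value = r.value * 2))   // second map
      .groupByKey(_.key)                       // typed aggregation
      .mapGroups((k, rs) => (k, rs.map(_.value).sum))

    result.collect()  // force the whole pipeline to execute
    spark.stop()
  }
}
```

The warnings appear in the executor logs while tasks from the aggregation stage are finishing.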
WARN [Executor task launch worker-0] org.apache.spark.memory.TaskMemoryManager - leak 16.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@33fb6a15
WARN [Executor task launch worker-0] org.apache.spark.memory.TaskMemoryManager - leak a page: org.apache.spark.unsafe.memory.MemoryBlock@29e74a69 in task 88341
WARN [Executor task launch worker-0] org.apache.spark.memory.TaskMemoryManager - leak a page: org.apache.spark.unsafe.memory.MemoryBlock@22316bec in task 88341
WARN [Executor task launch worker-0] org.apache.spark.executor.Executor - Managed memory leak detected; size = 17039360 bytes, TID = 88341

Thanks,
Ivan

--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Memory-leak-warnings-in-Spark-2-0-1-tp19424.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.