The issue was resolved by upgrading Spark to version 1.6.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-heap-space-out-of-memory-tp27050p27131.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---
What version of Spark are you running?
Do you see the heap space slowly increase over time?
Have you set the TTL cleaner (spark.cleaner.ttl)?
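
For reference, the TTL cleaner asked about above is the Spark 1.x setting
spark.cleaner.ttl, which makes the driver periodically forget old metadata
(generated stages and tasks, persisted RDDs, shuffle and broadcast state)
instead of retaining it indefinitely. Below is a minimal sketch of setting it
on a Java streaming application; the app name, master URL, socket source,
batch interval, and TTL value are illustrative, not taken from this thread.

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class TtlCleanerSketch {
      public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf()
            .setAppName("streaming-heap-sketch")   // illustrative app name
            .setMaster("local[2]")                 // illustrative master
            // Spark 1.x periodic metadata cleaner: forget metadata older
            // than this many seconds (here, one hour).
            .set("spark.cleaner.ttl", "3600");

        // 10-second batch interval, chosen only for the sketch.
        JavaStreamingContext jssc =
            new JavaStreamingContext(conf, Durations.seconds(10));

        // Illustrative source/output: print lines read from a local socket.
        jssc.socketTextStream("localhost", 9999).print();

        jssc.start();
        jssc.awaitTermination();
      }
    }

Note that spark.cleaner.ttl only exists in Spark 1.x; it was removed in later
releases in favour of the reference-tracking ContextCleaner, so it does not
apply after an upgrade to Spark 2.x or newer.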
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Processing-Time-Spikes-Spark-Streaming-tp22375p27130.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hi All,
We have a Spark Streaming v1.4 / Java 8 application that slows down and
eventually runs out of heap space. The less driver memory, the faster it
happens.
Appended are our Spark configuration and a snapshot of the heap taken
using jmap on the driver process. The RDDInfo, $colon$colon and [