Hi,
I'm using Spark 1.4.1 and I have a simple application that creates a DStream
that reads data from Kafka and applies a filter transformation to it. After
more or less a day it throws the following exception:
Exception in thread "dag-scheduler-event-loop" java.lang.OutOfMemoryError:
Java heap space
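For reference, here is a minimal sketch of the kind of application described, assuming Scala and the receiver-less direct Kafka API available in Spark 1.4.1. The broker list, topic name, batch interval, and filter predicate are placeholders, since the original post does not show them:

import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaFilterApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaFilterApp")
    // Batch interval is a placeholder; the original post does not mention it.
    val ssc = new StreamingContext(conf, Seconds(10))

    // Placeholder broker list and topic name.
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")
    val topics = Set("my-topic")

    // Direct (receiver-less) Kafka stream of (key, value) pairs.
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    // Keep only records whose value contains "ERROR"; the actual predicate
    // used in the original application is not shown in the post.
    val filtered = stream.filter { case (_, value) => value.contains("ERROR") }
    filtered.print()

    ssc.start()
    ssc.awaitTermination()
  }
}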
What do you mean? I've pasted the output in the same format used by Spark...