Hi Konstantin,
Thanks for reporting this. This happens because there are null keys in your
data. In general, Spark should not throw null pointer exceptions, so this
is a bug. I have fixed this here: https://github.com/apache/spark/pull/1288.
For now, you can work around this by special-handling your null keys
before the shuffle (for example, filtering them out or mapping them to a
sentinel value).
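A minimal sketch of that kind of workaround is below. The object name,
sample data, and output path are only placeholders, not taken from your
job; adapt the key/value types to your data.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._  // pair-RDD operations in Spark 1.0

object NullKeyWorkaround {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("null-key-workaround"))

    // Hypothetical pair RDD containing some null keys.
    val pairs = sc.parallelize(Seq(("a", 1), (null: String, 2), ("b", 3)))

    // Drop the null keys (or map them to a sentinel value instead)
    // before any shuffle, so the combiner never sees a null key.
    val cleaned = pairs.filter { case (k, _) => k != null }

    cleaned.reduceByKey(_ + _).saveAsTextFile("hdfs:///tmp/null-key-workaround-output")

    sc.stop()
  }
}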
Hi all,
I am hitting a very confusing exception running Spark 1.0 on HDP 2.1.
While saving an RDD as a text file I got:
14/07/02 10:11:12 WARN TaskSetManager: Loss was due to java.lang.NullPointerException
java.lang.NullPointerException
        at org.apache.spark.util.collection.ExternalAppendOnlyMap$Externa