Hi, I had the same problem.
One option (starting with Spark 1.2, which is currently in preview) is to
use the Avro library for Spark SQL.
The other is using Kryo serialization. By default Spark uses Java
serialization; you can specify Kryo serialization while creating the
Spark context:
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
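Fleshed out, that setup might look like the sketch below. It assumes Spark 1.2+ on the classpath (registerKryoClasses was added in 1.2); the app name is illustrative, and registering AvroKey is one common way to address the NotSerializableException from this thread:

```scala
import org.apache.avro.mapred.AvroKey
import org.apache.spark.{SparkConf, SparkContext}

// Switch from the default Java serializer to Kryo
val conf = new SparkConf()
  .setAppName("avro-kryo-example") // illustrative name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Optionally register the Avro wrapper class so Kryo writes a compact
  // class id instead of the full class name
  .registerKryoClasses(Array[Class[_]](classOf[AvroKey[_]]))

val sc = new SparkContext(conf)
```

On Spark versions before 1.2 the same effect can be had by setting spark.kryo.registrator to a custom KryoRegistrator instead of calling registerKryoClasses.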
I did not encounter this with my Avro records using Spark 1.1.0 (see
https://github.com/medale/spark-mail/blob/master/analytics/src/main/scala/com/uebercomputing/analytics/basic/UniqueSenderCounter.scala).
I do use the default Java serialization, but all the fields in my Avro
object are Serializable.
Yeah, I have the same problem with 1.1.0, but not 1.0.0.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/java-io-NotSerializableException-org-apache-avro-mapred-AvroKey-using-spark-with-avro-tp15165p20752.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.