Hi, I had the same problem.

One option (starting with Spark 1.2, which is currently in preview) is to
use the Avro library for Spark SQL.
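For reference, a minimal sketch of that approach, assuming the Databricks
spark-avro package is on the classpath (the avroFile method comes from its
implicits in early releases; the exact API may differ across versions):

import org.apache.spark.sql.SQLContext
import com.databricks.spark.avro._

val sqlContext = new SQLContext(sc)
// Load an Avro file as a SchemaRDD; "episodes.avro" is a placeholder path
val episodes = sqlContext.avroFile("episodes.avro")
episodes.registerTempTable("episodes")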

The other option is to use Kryo serialization. By default, Spark uses Java
serialization; you can switch to Kryo when creating the SparkContext:

import org.apache.spark.{SparkConf, SparkContext}

// Kryo can serialize classes (such as AvroKey) that do not
// implement java.io.Serializable
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
val sc = new SparkContext(conf)
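
Optionally, you can also register the classes you know will be serialized,
which lets Kryo write a compact class ID instead of the full class name. A
sketch, assuming Spark 1.2's registerKryoClasses and using AvroKey purely
as an example:

// Register classes with Kryo to reduce serialized size.
// Do this on the conf before creating the SparkContext.
conf.registerKryoClasses(Array(classOf[org.apache.avro.mapred.AvroKey[_]]))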

This worked for me.

Regards,
Anish


