I’ve read several discussions of this error here, so I’ve wiped all of the cluster 
machines and copied the master’s Spark build to the rest of the cluster. I’ve 
built my job on the master with the correct Spark version as a dependency, and 
even built against that same Spark build. I still get the incompatible 
serialVersionUID error.
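
For reference, this is roughly how the dependency is declared in my build.sbt 
(a minimal sketch; the project name and version strings are placeholders for 
whatever the cluster is actually running, and I’m assuming spark-core should be 
marked "provided" so the job jar doesn’t bundle its own copy of the Spark classes):

    // build.sbt (sketch; versions are placeholders, not the exact cluster build)
    name := "my-spark-job"           // hypothetical project name
    scalaVersion := "2.10.4"         // assumed to match the Scala version Spark was built with
    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"  // "provided": rely on the cluster's own Spark jars at runtime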

If I run the job locally with master = local[8], it completes fine.
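
That is, with the master set like this (a sketch of how I’m constructing the 
context; the app name is just illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    // Running against the local master works; pointing at the cluster master
    // is what triggers the InvalidClassException below.
    val conf = new SparkConf()
      .setAppName("MyJob")        // illustrative app name
      .setMaster("local[8]")      // works locally; the spark://... cluster master fails
    val sc = new SparkContext(conf)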

I thought I had incompatible builds, but in the end I’m not quite sure what this 
error is telling me:

14/10/16 15:21:03 WARN scheduler.TaskSetManager: Loss was due to java.io.InvalidClassException
java.io.InvalidClassException: org.apache.spark.rdd.RDD; local class incompatible: stream classdesc serialVersionUID = 385418487991259089, local class serialVersionUID = -6766554341038829528
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:560)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1599)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1494)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1599)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1494)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1748)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
        at org.apache.spark.scheduler.ShuffleMapTask$.deserializeInfo(ShuffleMapTask.scala:63)
        at org.apache.spark.scheduler.ShuffleMapTask.readExternal(ShuffleMapTask.scala:135)
        at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1814)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1773)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:85)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:165)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
        at java.lang.Thread.run(Thread.java:662)

