Hi Sean,
Below are the sbt dependencies that I am using.

I gave it another try after removing the "provided" keyword, but it failed
with the same error.
What confuses me is that the stack trace appears only after a few of the
stages have already run to completion.

  object V {
    val spark           = "1.2.0-cdh5.3.0"
    val esriGeometryAPI = "1.2"
    val csvWriter       = "1.0.0"
    val hadoopClient    = "2.5.0"
    val scalaTest       = "2.2.1"
    val jodaTime        = "1.6.0"
    val scalajHTTP      = "1.0.1"
    val avro            = "1.7.7"
    val scopt           = "3.2.0"
    val breeze          = "0.8.1"
    val config          = "1.2.1"
  }

  object Libraries {
    val EEAMessage      = "com.waterloopublic" %% "eeaformat" % "1.0-SNAPSHOT"
    val avro            = "org.apache.avro" % "avro-mapred" % V.avro classifier "hadoop2"
    val spark           = "org.apache.spark" % "spark-core_2.10" % V.spark % "provided"
    val hadoopClient    = "org.apache.hadoop" % "hadoop-client" % V.hadoopClient % "provided"
    val esriGeometryAPI = "com.esri.geometry" % "esri-geometry-api" % V.esriGeometryAPI
    val scalaTest       = "org.scalatest" %% "scalatest" % V.scalaTest % "test"
    val csvWriter       = "com.github.tototoshi" %% "scala-csv" % V.csvWriter
    val jodaTime        = "com.github.nscala-time" %% "nscala-time" % V.jodaTime % "provided"
    val scalajHTTP      = "org.scalaj" %% "scalaj-http" % V.scalajHTTP
    val scopt           = "com.github.scopt" %% "scopt" % V.scopt
    val breeze          = "org.scalanlp" %% "breeze" % V.breeze
    val breezeNatives   = "org.scalanlp" %% "breeze-natives" % V.breeze
    val config          = "com.typesafe" % "config" % V.config
  }
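
In case it helps, here is roughly how I understand the "provided" scope is
meant to interact with the fat jar: Spark and Hadoop are compiled against but
not bundled, so the executors only ever see the cluster's own Spark classes.
A minimal sketch (assuming the sbt-assembly plugin; the merge strategy below
is illustrative, not copied from my actual build):

  // Sketch: Spark/Hadoop stay "provided" so the assembly jar ships only
  // application classes and the cluster supplies the Spark binaries.
  libraryDependencies ++= Seq(
    Libraries.spark,         // % "provided" -> compiled against, not bundled
    Libraries.hadoopClient,  // % "provided"
    Libraries.EEAMessage,
    Libraries.esriGeometryAPI,
    Libraries.csvWriter
  )

  // sbt-assembly (assumed): drop duplicate META-INF entries pulled in by
  // transitive dependencies, keep the first copy of everything else.
  assemblyMergeStrategy in assembly := {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case _                             => MergeStrategy.first
  }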

There are only a few more things left to try (like reverting to Spark 1.1)
before I run out of ideas completely.
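
One more check I plan to try first (a rough, unverified sketch; it assumes a
live SparkContext named sc): look up the serialVersionUID of PairRDDFunctions
on the driver and again inside a task, so a mismatch would confirm that the
executors load a different Spark build than the one my jar was compiled
against.

  // Diagnostic sketch: compare the serialVersionUID of PairRDDFunctions as
  // seen by the driver with the one the executors load from their classpath.
  import java.io.ObjectStreamClass
  import org.apache.spark.rdd.PairRDDFunctions

  val driverUid =
    ObjectStreamClass.lookup(classOf[PairRDDFunctions[_, _]]).getSerialVersionUID
  println(s"driver   : spark=${sc.version} PairRDDFunctions uid=$driverUid")

  // Run the same lookup inside a task so it resolves against the executor jars.
  val executorUids = sc.parallelize(1 to sc.defaultParallelism).map { _ =>
    ObjectStreamClass
      .lookup(Class.forName("org.apache.spark.rdd.PairRDDFunctions"))
      .getSerialVersionUID
  }.distinct().collect()
  println(s"executors: PairRDDFunctions uid(s)=${executorUids.mkString(", ")}")
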
Please share your insights.

..Manas

On Wed, Mar 11, 2015 at 9:44 AM, Sean Owen <so...@cloudera.com> wrote:

> This usually means you are mixing different versions of code. Here it
> is complaining about a Spark class. Are you sure you built vs the
> exact same Spark binaries, and are not including them in your app?
>
> On Wed, Mar 11, 2015 at 1:40 PM, manasdebashiskar
> <manasdebashis...@gmail.com> wrote:
> > (This is a repost. Maybe a simpler subject will fetch more attention
> > among experts.)
> >
> > Hi,
> >  I have a CDH 5.3.2 (Spark 1.2) cluster.
> >  I am getting a local class incompatible exception for my Spark
> > application during an action.
> > All my classes are case classes (to the best of my knowledge).
> >
> > Appreciate any help.
> >
> > Exception in thread "main" org.apache.spark.SparkException: Job aborted due
> > to stage failure: Task 0 in stage 3.0 failed 4 times, most recent failure:
> > Lost task 0.3 in stage 3.0 (TID 346, datanode02):
> > java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions;
> > local class incompatible: stream classdesc serialVersionUID = 8789839749593513237,
> > local class serialVersionUID = -4145741279224749316
> > at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
> > at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
> > at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
> > at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> > at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> > at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> > at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> > at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> > at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> > at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> > at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> > at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:62)
> > at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:87)
> > at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:57)
> > at org.apache.spark.scheduler.Task.run(Task.scala:56)
> > at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
> > at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > at java.lang.Thread.run(Thread.java:745)
> >
> >
> > Thanks
> > Manas
> > Manas Kar
> >
>
