After upgrading to Spark 1.0.0, I get this error:

 ERROR org.apache.spark.executor.ExecutorUncaughtExceptionHandler -
 Uncaught exception in thread Thread[Executor task launch worker-2,5,main]
 java.lang.IncompatibleClassChangeError: Found interface
 org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected

My understanding is that TaskAttemptContext was a class in Hadoop 1.x but
became an interface in Hadoop 2.x, so I suspect something on my classpath
is still compiled against Hadoop 1.0.4 (even though I downloaded the
Spark 1.0.0 build for Hadoop 2). I can't seem to fix it, though. Any
advice?
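
For reference, this is roughly how my build pulls in Spark and Hadoop (a
minimal sketch of an sbt build from memory; the hadoop-client version
2.2.0 is just a guess at what my cluster runs, not something the error
confirms):

    // build.sbt (sketch; exact versions may be off)
    libraryDependencies ++= Seq(
      // Spark 1.0.0, cross-built for Scala 2.10
      "org.apache.spark" %% "spark-core" % "1.0.0",
      // Pin hadoop-client to a 2.x release so the transitive
      // Hadoop 1.0.4 default doesn't end up on the classpath
      "org.apache.hadoop" % "hadoop-client" % "2.2.0"
    )

If pinning hadoop-client like this is the wrong way to go about it, I'm
happy to try something else.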
