Hi all

I had a similar usage and ran into the same problem.

I guess Spark uses a class from my user jars when it should actually use the
class in spark-assembly-xxx.jar, but I don't know how to fix it.
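
One way to see which jar a class is actually being loaded from (a small diagnostic sketch, not Spark-specific; `WhichJar` and `locationOf` are names I made up for illustration) is to ask the classloader for the class's resource URL:

```scala
object WhichJar {
  // Returns the URL the JVM loaded the given class file from, e.g.
  // jar:file:/.../scala-library.jar!/scala/Option.class.
  // Running this inside a Spark task with and without
  // userClassPathFirst can show which jar wins the conflict.
  def locationOf(cls: Class[_]): String = {
    val resource = "/" + cls.getName.replace('.', '/') + ".class"
    String.valueOf(cls.getResource(resource))
  }

  def main(args: Array[String]): Unit = {
    // scala.Option is the type named in the ClassCastException below,
    // so it is a natural one to check first.
    println(locationOf(classOf[scala.Option[_]]))
  }
}
```

If the printed path points at your application jar instead of the Spark/Scala distribution jar, that mismatch would explain the `cannot assign instance of scala.None$` error.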

Kyle



2015-07-22 23:03 GMT+08:00 Ashish Soni <asoni.le...@gmail.com>:

> Hi All ,
>
> I am getting the below error when I use the --conf
> spark.files.userClassPathFirst=true parameter
>
> Job aborted due to stage failure: Task 3 in stage 0.0 failed 4 times, most
> recent failure: Lost task 3.3 in stage 0.0 (TID 32, 10.200.37.161):
> java.lang.ClassCastException: cannot assign instance of scala.None$ to
> field org.apache.spark.scheduler.Task.metrics of type scala.Option in
> instance of org.apache.spark.scheduler.ResultTask
>
> I am submitting as below
>
> spark-submit --conf spark.files.userClassPathFirst=true --driver-memory 6g
> --executor-memory 12g --executor-cores 4   --class
> com.ericsson.engine.RateDriver --master local
> /home/spark/workspace/simplerating/target/simplerating-0.0.1-SNAPSHOT.jar
> spark://eSPARKMASTER:7077 hdfs://enamenode/user/spark
>
> thanks
>