Hello Akhil,

I use Spark 1.4.2 on HDP 2.1 (Hadoop 2.4).

I didn't use --driver-class-path; I only set
spark.executor.userClassPathFirst=true
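
For reference, that setting can be passed at submit time like this (a minimal sketch — the class name, master URL, and jar path below are placeholders, not from this thread):

```shell
# Prefer classes from the user jar on the executors (available since Spark 1.3).
# spark.driver.userClassPathFirst does the same for the driver side.
spark-submit \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.MyApp \
  --master spark://master:7077 \
  my-app.jar
```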


Kyle


2015-08-14 17:11 GMT+08:00 Akhil Das <ak...@sigmoidanalytics.com>:

> Which version of Spark are you using? Did you try the --driver-class-path
> option?
>
> Thanks
> Best Regards
>
> On Fri, Aug 14, 2015 at 2:05 PM, Kyle Lin <kylelin2...@gmail.com> wrote:
>
>> Hi all
>>
>> I had similar usage and also got the same problem.
>>
>> I guess Spark uses some classes from my user jars when it should actually
>> use the ones in spark-assembly-xxx.jar, but I don't know how to fix it.
>>
>> Kyle
>>
>>
>>
>> 2015-07-22 23:03 GMT+08:00 Ashish Soni <asoni.le...@gmail.com>:
>>
>>> Hi all,
>>>
>>> I am getting the error below when I use the --conf
>>> spark.files.userClassPathFirst=true parameter:
>>>
>>> Job aborted due to stage failure: Task 3 in stage 0.0 failed 4 times,
>>> most recent failure: Lost task 3.3 in stage 0.0 (TID 32, 10.200.37.161):
>>> java.lang.ClassCastException: cannot assign instance of scala.None$ to
>>> field org.apache.spark.scheduler.Task.metrics of type scala.Option in
>>> instance of org.apache.spark.scheduler.ResultTask
>>>
>>> I am submitting the job as below:
>>>
>>> spark-submit --conf spark.files.userClassPathFirst=true --driver-memory
>>> 6g --executor-memory 12g --executor-cores 4   --class
>>> com.ericsson.engine.RateDriver --master local
>>> /home/spark/workspace/simplerating/target/simplerating-0.0.1-SNAPSHOT.jar
>>> spark://eSPARKMASTER:7077 hdfs://enamenode/user/spark
>>>
>>> thanks
>>>
>>
>>
>