Try adding --files /path/of/hive-site.xml to spark-submit and run.
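A minimal sketch of this suggestion (the main class, application jar, and hive-site.xml path are placeholders, not taken from the original thread):

```shell
# Ship hive-site.xml with the application so Spark picks up the
# Hive metastore configuration on the driver and executors.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --files /path/of/hive-site.xml \
  --class your.main.Class \
  your-application.jar
```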

On Thursday 18 August 2016 05:26 PM, Diwakar Dhanuskodi wrote:
Hi

Can you cross-check by providing the same library path in --jars of spark-submit and run?
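As a sketch, this suggestion might look like the following (the jar list is illustrative, assuming the Hive 0.13.1 lib directory shown in the logs below; the main class and application jar are placeholders):

```shell
# Pass the Hive jars explicitly so they are added to the driver and
# executor classpaths. Note that --jars takes a comma-separated list
# of jar files, not a colon-separated directory path.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --jars /data0/facai/lib/hive-0.13.1/lib/hive-common-0.13.1.jar,/data0/facai/lib/hive-0.13.1/lib/hive-cli-0.13.1.jar \
  --class your.main.Class \
  your-application.jar
```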


Sent from Samsung Mobile.


-------- Original message --------
From: "颜发才(Yan Facai)" <yaf...@gmail.com>
Date:18/08/2016 15:17 (GMT+05:30)
To: "user.spark" <user@spark.apache.org>
Cc:
Subject: [Spark 2.0] ClassNotFoundException is thrown when using Hive

Hi, all.

I copied hdfs-site.xml, core-site.xml and hive-site.xml to $SPARK_HOME/conf, and used spark-submit to submit the task to YARN in **client** mode.
However, a ClassNotFoundException is thrown.

Some details of the logs are listed below:
```
16/08/12 17:07:32 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 0.13.1 using file:/data0/facai/lib/hive-0.13.1/lib:file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
16/08/12 17:07:32 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/session/SessionState when creating Hive client using classpath: file:/data0/facai/lib/hive-0.13.1/lib, file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
```

In fact, all the jars needed by Hive are in the directory:
```Bash
[hadoop@h107713699 spark_test]$ ls /data0/facai/lib/hive-0.13.1/lib/ | grep hive
hive-ant-0.13.1.jar
hive-beeline-0.13.1.jar
hive-cli-0.13.1.jar
hive-common-0.13.1.jar
...
```

So, my question is:
why can't Spark find the jars it needs?

Any help would be appreciated, thanks.



