Hi all,
We also encountered these exceptions when integrating Spark 3.0.1 with Hive
2.1.1-cdh6.1.0 and HBase 2.1.0-cdh6.1.0.
Does anyone have any ideas on how to resolve these exceptions?
Thanks in advance.
Best.
Michael Yang
Hi Pradyumn,
We integrated Spark 3.0.1 with Hive 2.1.1-cdh6.1.0, and querying Hive tables
through spark-sql works fine.
Make sure you configure spark-defaults.conf and spark-env.sh properly and copy
the Hive/Hadoop-related config files into the Spark conf folder.
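For example, on a CDH cluster the client configs usually live under
/etc/hive/conf and /etc/hadoop/conf (these paths are assumptions; adjust them
to your deployment), so copying them over looks roughly like this:

  # Assumed CDH client-config paths; adjust to your environment.
  cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/
  cp /etc/hadoop/conf/core-site.xml /etc/hadoop/conf/hdfs-site.xml $SPARK_HOME/conf/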
You can refer to the references below for details.
ht
Hi Pradyumn,
It seems you did not configure the spark-defaults.conf file properly.
The configurations below are needed to use Hive 2.1.1 as the metastore and
execution engine:
spark.sql.hive.metastore.version=2.1.1
spark.sql.hive.metastore.jars=/opt/cloudera/parcels/CDH/lib/hive/lib/*
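As a quick sanity check after setting those properties (the query is only an
example), you can run spark-sql against the metastore:

  # Should list the databases registered in the Hive 2.1.1 metastore.
  $SPARK_HOME/bin/spark-sql -e "show databases;"

If this prints your Hive databases, the metastore configuration is being
picked up correctly.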
Thanks.
Michael Yang