I'd recommend against using the built-in jars with a different version of Hive. You don't need to build your own Spark; just set spark.sql.hive.metastore.jars / spark.sql.hive.metastore.version (see the "Interacting with Different Versions of Hive Metastore" section of the Spark SQL docs).
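As a rough sketch (the jar path below is just a placeholder, and if I remember correctly metastore 2.1 is supported as of Spark 2.2), something like this in spark-defaults.conf:

    # Talk to a Hive 2.1 metastore instead of using the built-in 1.2.1 client
    spark.sql.hive.metastore.version   2.1
    # Either "maven" (download the client jars automatically) or a classpath
    # containing the Hive 2.1 client jars and their dependencies
    spark.sql.hive.metastore.jars      /path/to/hive-2.1/lib/*

The same two settings can also be passed with --conf on spark-submit / spark-shell.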
On Thu, Nov 9, 2017 at 2:10 AM, yaooqinn <yaooq...@gmail.com> wrote:
> Hi, all
>
> The built-in Hive version for Spark 2.x is hive-1.2.1.spark2. I'd like to
> know whether it works with a Hive metastore of version 2.1 or not.
>
> If not, I'd like to build a Spark package with -Dhive.version=2.x.spark2,
> but I find no such Maven artifact; is there any process to deploy one?
>
> Or do I just need to point *spark.sql.hive.metastore.jars* at the Hive 2.1
> client jars?
>
> Best regards!
> Kent Yao

--
Marcelo