In order for Spark to see the Hive metastore, you need to build the SparkSession accordingly:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[2]")
  .appName("myApp")
  .config("hive.metastore.uris", "thrift://localhost:9083")
  .enableHiveSupport()
  .getOrCreate()

On Mon, Nov 12, 2018 at 11:49 AM Ирина Шершукова <irinawerwuk...@gmail.com> wrote:

> hello guys, spark2.1.0 couldn’t connect to existing Hive metastore.
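Once the session is up, a quick way to confirm it is actually talking to the existing metastore is to ask it what Hive already knows about. Just a minimal sketch; "default" below is only a placeholder database name:

// These should list the databases and tables registered in the Hive metastore.
spark.sql("show databases").show()
// "default" is a placeholder; use whichever database you expect to see.
spark.sql("show tables in default").show()

If these only return the built-in "default" database with no tables, the session is most likely using a local embedded metastore instead of the one at thrift://localhost:9083.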