In addition to Cheng's comment --
I ran into a similar problem when hive-site.xml is not on the classpath. A
proper stack trace can pinpoint the problem.
In the meantime, you can add it to your environment through
HADOOP_CONF_DIR (export HADOOP_CONF_DIR=/etc/hive/conf/).
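A quick way to check from the Spark shell whether hive-site.xml is actually
visible is a classpath lookup (a minimal sketch, not from the original thread):

    // Returns null when hive-site.xml is not on the classpath
    val url = getClass.getClassLoader.getResource("hive-site.xml")
    println(if (url == null) "hive-site.xml NOT on classpath" else s"found at: $url")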
If you don't need to interact with Hive, you may compile Spark without
the -Phive flag to eliminate the Hive dependencies. That way, the
sqlContext instance in the Spark shell will be of type SQLContext instead
of HiveContext.
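For illustration, a Hive-free session looks roughly like this (a minimal
sketch, assuming a Spark 1.x build without -Phive; `sc` is the SparkContext
the shell provides, and the path is hypothetical):

    import org.apache.spark.sql.SQLContext

    // A plain SQLContext needs no Hive metastore to read Parquet files
    val sqlContext = new SQLContext(sc)
    val df = sqlContext.parquetFile("/tmp/parquet-dir")
    df.printSchema()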
The reason behind the Hive metastore error is probably a Hive
misconfiguration, such as hive-site.xml not being on the classpath.
Hello,
sqlContext.parquetFile(dir)
throws the exception "Unable to instantiate
org.apache.hadoop.hive.metastore.HiveMetaStoreClient".
The strange thing is that the second attempt to open the file
succeeds:
try {
  sqlContext.parquetFile(dir)
} catch {
  case e: Exception => sqlContext.parquetFile(dir)
}
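If the second call reliably succeeds, the retry can be factored out (a hedged
sketch, not from the original thread; retryOnce is a hypothetical helper, not
part of Spark):

    // Retries the by-name block once if the first evaluation throws
    def retryOnce[T](block: => T): T =
      try block catch { case _: Exception => block }

    val data = retryOnce(sqlContext.parquetFile(dir))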