I used the Spark web UI and could see that the conf directory is on the CLASSPATH.
One odd thing is that when I start spark-shell I always get the following
message:
WARN NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
At first, I th
Could you enable the HistoryServer and provide the properties and CLASSPATH for
spark-shell? And could you run the 'env' command to list your environment variables?
By the way, what do the Spark logs say? Enable debug mode to see what's
going on in spark-shell when it tries to interact with and initialize the HiveContext.
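One way to turn on that debug output (a sketch, assuming the stock log4j setup that Spark shipped at the time): copy conf/log4j.properties.template to conf/log4j.properties and raise the root category to DEBUG:

```
# conf/log4j.properties -- raise spark-shell logging to DEBUG
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

With this in place, spark-shell prints the Hive initialization steps (including which hive-site.xml, if any, gets picked up) to stderr.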
Hi Yin and Andrew, thank you for your replies.
When I create a table in the Hive CLI, it works correctly and the table shows up
in HDFS. I had forgotten to start HiveServer2 before; I started it today.
Then I ran the command below:
spark-shell --master spark://192.168.40.164:7077 --driver-class-path
co
Another way is to set "hive.metastore.warehouse.dir" explicitly to the HDFS
directory storing Hive tables by using the SET command. For example:
hiveContext.hql("SET
hive.metastore.warehouse.dir=hdfs://localhost:54310/user/hive/warehouse")
On Thu, Jul 31, 2014 at 8:05 AM, Andrew Lee wrote:
Hi All,
It has been a while, but what I did to make it work was to ensure the
following:
1. Hive is working when you run the Hive CLI and JDBC via HiveServer2.
2. Make sure you have the hive-site.xml from the above Hive configuration. The
point here is that you want the hive-site.xml from the Hive
Hi, Michael. I have the same problem. My warehouse directory is always
created locally. I copied the default hive-site.xml into the
$SPARK_HOME/conf directory on each node. After I executed the code below,
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
hiveContext.hql("CREA
The warehouse and the metastore directories are two different things. The
metastore holds the schema information about the tables and will by default
be a local directory. With javax.jdo.option.ConnectionURL you can
configure it to point at something like MySQL. The warehouse directory is the
default
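To illustrate the distinction, a hive-site.xml that points the metastore (schemas) at MySQL while keeping the warehouse (data) in HDFS might look like the sketch below. The MySQL host, port, and database name are placeholders, not values from this thread; the warehouse path reuses the one from the SET example earlier.

```xml
<configuration>
  <!-- Metastore: where table SCHEMAS live (here, a MySQL database). -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/hive_metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <!-- Warehouse: where table DATA lives (an HDFS path). -->
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>hdfs://localhost:54310/user/hive/warehouse</value>
  </property>
</configuration>
```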
Thanks for the response... hive-site.xml is on the classpath, so that doesn't
seem to be the issue.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/HiveContext-is-creating-metastore-warehouse-locally-instead-of-in-hdfs-tp10838p10871.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
I ran this before; actually the hive-site.xml works this way for me (the
tricky part happens in the new HiveConf(classOf[SessionState])). Can you double-check
whether hive-site.xml can be loaded from the classpath? It is supposed to appear at the
root of the classpath.
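The "root of the classpath" requirement comes down to a plain classloader resource lookup: getResource("hive-site.xml") only succeeds when the file sits at the top of a classpath entry (e.g. the conf directory itself is on the classpath), not buried in a subdirectory. A minimal sketch of that mechanism, using a throwaway directory rather than a real Spark install:

```scala
import java.io.File
import java.net.URLClassLoader
import java.nio.file.Files

// Build a throwaway classpath entry containing hive-site.xml at its root.
val confDir = Files.createTempDirectory("conf").toFile
val siteXml = new File(confDir, "hive-site.xml")
Files.write(siteXml.toPath, "<configuration/>".getBytes("UTF-8"))

// A classloader rooted at that directory resolves the file by bare name...
val loader = new URLClassLoader(Array(confDir.toURI.toURL), null)
val found = loader.getResource("hive-site.xml") // non-null: found at the root

// ...but a file that is not at the root of any classpath entry is not found.
val missing = loader.getResource("subdir/other-site.xml")
```

Inside spark-shell the equivalent sanity check would be calling getClass.getClassLoader.getResource("hive-site.xml") and verifying it returns a non-null URL (that call is the generic JVM lookup, not a Spark-specific API).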