ah, that explains it, many thanks!
On Sat, May 16, 2015 at 7:41 PM, Yana Kadiyska wrote:
oh...metastore_db location is not controlled by
hive.metastore.warehouse.dir -- one is the location of your metastore DB,
the other is the physical location of your stored data. Check out this SO
thread:
http://stackoverflow.com/questions/13624893/metastore-db-created-wherever-i-run-hive
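For reference, with the default embedded Derby metastore the metastore_db location is governed by javax.jdo.option.ConnectionURL, not by the warehouse dir. A minimal sketch of the relevant hive-site.xml property (the databaseName path here is purely illustrative):

```xml
<!-- hive-site.xml sketch: pin the Derby metastore to a fixed path
     instead of the current working directory. Path is an example only. -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=/var/lib/hive/metastore_db;create=true</value>
  </property>
</configuration>
```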
Gave it another try - it seems that it picks up the variable and prints out
the correct value, but still puts the metastore_db folder in the current
directory, regardless.
On Sat, May 16, 2015 at 1:13 PM, Tamas Jambor wrote:
Thank you for the reply.
I have tried your experiment, it seems that it does not print the settings
out in spark-shell (I'm using 1.3 by the way).
Strangely, I have been experimenting with an SQL connection instead, which
works after all (still, if I go to spark-shell and try to print out the SQL
s
My point was more about how to verify that properties are picked up from
the hive-site.xml file. You don't really need hive.metastore.uris if you're
not running against an external metastore. I just did an experiment with
warehouse.dir.
My hive-site.xml looks like this:
hive.metastore
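The hive-site.xml snippet above was cut off in the archive. A plausible reconstruction of a minimal file that sets only the warehouse dir for this experiment (the /user/hive/warehouse value is illustrative, not the author's actual setting):

```xml
<!-- Sketch of a minimal hive-site.xml; the value shown is an example. -->
<configuration>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
</configuration>
```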
Thanks for the reply. I am trying to use it without a Hive setup
(Spark standalone), so it prints something like this:
hive_ctx.sql("show tables").collect()
15/05/15 17:59:03 INFO HiveMetaStore: 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/05/15 17:59
This should work. Which version of Spark are you using? Here is what I do
-- make sure hive-site.xml is in the conf directory of the machine you're
using the driver from. Now let's run spark-shell from that machine:
scala> val hc = new org.apache.spark.sql.hive.HiveContext(sc)
hc: org.apache.spark.
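One way to check which value actually took effect (a sketch; it assumes a working HiveContext like the hc above, and that SET commands are supported as in Spark 1.3):

```scala
// In spark-shell, with hc created as above.
// "SET <key>" returns the effective value of the property as a key=value row,
// so you can confirm whether hive-site.xml was picked up.
hc.sql("SET hive.metastore.warehouse.dir").collect().foreach(println)
```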
I have tried putting the hive-site.xml file in the conf/ directory;
it seems it is not being picked up from there.
On Thu, May 14, 2015 at 6:50 PM, Michael Armbrust wrote:
You can configure Spark SQL's Hive interaction by placing a hive-site.xml
file in the conf/ directory.
On Thu, May 14, 2015 at 10:24 AM, jamborta wrote:
> Hi all,
>
> is it possible to set hive.metastore.warehouse.dir, which is created
> internally by Spark, to an external location (e.g. S3 on AWS)?