Re: Link existing Hive to Spark

2015-02-06 Thread Ashutosh Trivedi (MT2013030)
ok. Is there no way to specify it in code, when I create the SparkConf?

> You can always just add the entry ...
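What Ashutosh is asking for is possible; a minimal sketch, assuming Spark 1.x built with Hive support (the application name and metastore host are placeholders, not from the thread):

```scala
// Sketch: setting the Hive metastore URI in code instead of relying on
// conf/hive-site.xml. Assumes Spark 1.x with Hive support on the
// classpath; "metastore-host" is a placeholder hostname.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("LinkExistingHive")
val sc = new SparkContext(conf)
val hive = new HiveContext(sc)

// Properties that would normally live in hive-site.xml can be set on
// the context after it is created:
hive.setConf("hive.metastore.uris", "thrift://metastore-host:9083")
```

Note that `setConf` lives on the `HiveContext` (inherited from `SQLContext`), not on `SparkConf` itself, which is why placing the file in `conf/` is the more commonly documented route.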

Re: Link existing Hive to Spark

2015-02-06 Thread Todd Nist
> ...go in ivy2 and put hive-site.xml there?
> If I build Spark from source code, I can put the file in conf/ but I am avoiding that.

Re: Link existing Hive to Spark

2015-02-06 Thread Ashutosh Trivedi (MT2013030)
...go in ivy2 and put hive-site.xml there? If I build Spark from source code, I can put the file in conf/ but I am avoiding that.

Re: Link existing Hive to Spark

2015-02-06 Thread Todd Nist
Hi Ashu,

Per the documentation, configuration of Hive is done by placing your hive-site.xml file in conf/. For example, you can place something like this in your $SPARK_HOME/conf/hive-site.xml file:

  <configuration>
    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://HostNameHere:9083</value>
      <description>URI for client to contact metastore</description>
    </property>
  </configuration>
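Once that file is in place, the link can be checked from a Spark application; a sketch, assuming Spark 1.x with Hive support and a reachable metastore (nothing below is from the thread itself):

```scala
// Sketch: verifying that Spark sees the existing Hive metastore by
// listing its tables. Assumes hive-site.xml is already in
// $SPARK_HOME/conf as described above, and Spark 1.x with Hive support.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("HiveLinkCheck"))
val hive = new HiveContext(sc)

// Tables registered in the external metastore should appear here:
hive.sql("SHOW TABLES").collect().foreach(println)
```

If the metastore URI is wrong or unreachable, the `HiveContext` typically fails at the first metastore call rather than at construction, so `SHOW TABLES` is a convenient smoke test.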