Re: Link existing Hive to Spark

2015-02-06 Thread Ashutosh Trivedi (MT2013030)
ok. Is there no way to specify it in code, when I create the SparkConf? From: Todd Nist Sent: Friday, February 6, 2015 10:08 PM To: Ashutosh Trivedi (MT2013030) Cc: user@spark.apache.org Subject: Re: Link existing Hive to Spark You can always just add the entry …
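
[Editor's note: a minimal sketch of the programmatic route being asked about here, assuming a Spark 1.x HiveContext and a reachable Hive metastore Thrift service; the object name, host, and port are placeholders, not details from the thread.]

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object ExistingHiveInCode {
  def main(args: Array[String]): Unit = {
    // Plain SparkConf; the Hive settings are applied on the HiveContext
    // below rather than on the SparkConf itself.
    val sc = new SparkContext(new SparkConf().setAppName("ExistingHiveInCode"))

    val hiveCtx = new HiveContext(sc)
    // Point the context at an existing Hive metastore instead of letting it
    // create a fresh local Derby one. Host and port are placeholders for
    // your metastore's Thrift URI.
    hiveCtx.setConf("hive.metastore.uris", "thrift://metastore-host:9083")

    hiveCtx.sql("SHOW TABLES").collect().foreach(println)
    sc.stop()
  }
}

The setConf call stands in for a hive-site.xml in Spark's conf/ directory, assuming the metastore service is already running.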

Re: Link existing Hive to Spark

2015-02-06 Thread Todd Nist
…e.xml there? > > If I build Spark from source code, I can put the file in conf/ but I am avoiding that. > > From: Todd Nist > Sent: Friday, February 6, 2015 8:32 PM > To: Ashutosh Trivedi (MT2013030) > Cc: user@spark.apache.org …

Re: Link existing Hive to Spark

2015-02-06 Thread Ashutosh Trivedi (MT2013030)

Re: Link existing Hive to Spark

2015-02-06 Thread Todd Nist

Link existing Hive to Spark

2015-02-06 Thread ashu
…current directory. So I have an existing Hive set up and configured; how would I be able to use the same in Spark? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Link-existing-Hive-to-Spark-tp21531.html
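
[Editor's note: the replies above converge on making the existing hive-site.xml visible to Spark, e.g. by copying it into Spark's conf/ directory, and then querying through a HiveContext. A minimal sketch under that assumption, for Spark 1.x; the table name is a placeholder.]

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object QueryExistingHive {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("QueryExistingHive"))

    // With hive-site.xml from the existing install on the classpath
    // (e.g. copied into Spark's conf/ directory), HiveContext talks to
    // the existing metastore rather than creating a new local one.
    val hiveCtx = new HiveContext(sc)

    // "some_existing_table" is a placeholder for a table already defined
    // in the existing Hive warehouse.
    hiveCtx.sql("SELECT * FROM some_existing_table LIMIT 10")
      .collect()
      .foreach(println)

    sc.stop()
  }
}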