Ok. Is there no way to specify it in code, when I create SparkConf?
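[Archive note: a HiveContext does let you point at the metastore in code rather than through conf/hive-site.xml. A minimal sketch against the Spark 1.x API; the thrift host and port below are placeholders for your metastore service:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val conf = new SparkConf().setAppName("LinkExistingHive")
    val sc = new SparkContext(conf)

    // HiveContext picks up hive-site.xml from the classpath when present;
    // without it, the metastore URI can be supplied programmatically:
    val hiveContext = new HiveContext(sc)
    hiveContext.setConf("hive.metastore.uris", "thrift://metastore-host:9083")

    // Tables from the existing Hive installation should now be visible:
    hiveContext.sql("SHOW TABLES").collect().foreach(println)

The setting goes on the HiveContext rather than on the SparkConf itself; as far as I know, bare hive.* keys set on a SparkConf are not propagated to the Hive metastore client in the 1.x line.]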
From: Todd Nist
Sent: Friday, February 6, 2015 10:08 PM
To: Ashutosh Trivedi (MT2013030)
Cc: user@spark.apache.org
Subject: Re: Link existing Hive to Spark
You can always just add the entry to the hive-site.xml.
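[Archive note: the entry in question is presumably hive.metastore.uris. A minimal hive-site.xml sketch with a placeholder host; note that the prebuilt Spark download also ships a conf/ directory where this file can be dropped, so no source build is required:

    <configuration>
      <property>
        <name>hive.metastore.uris</name>
        <!-- placeholder: the thrift endpoint of your existing Hive metastore -->
        <value>thrift://metastore-host:9083</value>
      </property>
    </configuration>
]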
> …there?
>
> If I build Spark from source code, I can put the file in conf/, but I am avoiding that.
> --
> From: Todd Nist
> Sent: Friday, February 6, 2015 8:32 PM
> To: Ashutosh Trivedi (MT2013030)
> Cc: user@spark.apache.org
…in the current directory.
So I have an existing Hive set up and configured; how would I be able to use the same in Spark?
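[Archive note: the "current directory" remark above matches the documented fallback: when no hive-site.xml is found on the classpath, HiveContext creates an embedded Derby metastore (metastore_db) and a warehouse directory under the current working directory instead of using the existing Hive. A minimal Scala illustration, assuming sc is an already-created SparkContext:

    import org.apache.spark.sql.hive.HiveContext

    // Unconfigured HiveContext: with no hive-site.xml on the classpath,
    // this creates metastore_db/ (embedded Derby) and a local warehouse
    // under the current working directory, not in the existing Hive.
    val hc = new HiveContext(sc)
    hc.sql("CREATE TABLE src (key INT, value STRING)")
]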
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Link-existing-Hive-to-Spark-tp21531.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.