But Hive 1.2.1 does not ship with a hive-site.xml; I tried adding my own,
which caused several other issues. On the other hand, it works well for me
with Hive 2.0.1, where the hive-site.xml content was as below and was also
copied to spark/conf. It worked.
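For reference, here is the copy step (also suggested by Dongjoon below) plus
a quick sanity check that spark-shell picks up the shared metastore; a
minimal sketch, assuming the /usr/local/* install paths from this thread:

cp /usr/local/hive/conf/hive-site.xml /usr/local/spark/conf/

# Start spark-shell from any directory. If the shared metastore is picked
# up, the existing Hive databases are listed and no local metastore_db
# directory appears in the current working directory:
/usr/local/spark/bin/spark-shell
# scala> spark.sql("SHOW DATABASES").show()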

*5. hive-site.xml configuration setup*


Add the following to conf/hive-site.xml; if the file is not there, create it.


<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
  <description>metadata is stored in a MySQL server</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>MySQL JDBC driver class</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
  <description>user name for connecting to MySQL server</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepassword</value>
  <description>password for connecting to MySQL server</description>
</property>
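These properties assume a one-time MySQL-side setup. A sketch, assuming
MySQL runs locally and using the user/password/database names from the
values above (the connector jar name and version will vary):

# Create the metastore user and grant it access:
mysql -u root -p <<'SQL'
CREATE USER 'hiveuser'@'localhost' IDENTIFIED BY 'hivepassword';
GRANT ALL PRIVILEGES ON metastore.* TO 'hiveuser'@'localhost';
FLUSH PRIVILEGES;
SQL

# Put the MySQL JDBC driver on Hive's classpath:
cp mysql-connector-java-*.jar $HIVE_HOME/lib/

# Initialize the metastore schema (schematool ships with Hive):
$HIVE_HOME/bin/schematool -dbType mysql -initSchema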


Replace the following three property tags, overriding whatever values exist
by default; otherwise Hive will throw an error like:


"java.net.URISyntaxException: Relative path in absolute URI:
${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D"


<!-- Note: shell variables such as $HIVE_HOME are not expanded inside
     hive-site.xml, so an absolute path is used here (the install
     directory from this thread); adjust it to your environment. -->

<property>
  <name>hive.querylog.location</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Location of Hive run time structured log file</description>
</property>

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Local scratch space for Hive jobs</description>
</property>

<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/usr/local/hive/iotmp</value>
  <description>Temporary local directory for added resources in the remote
  file system.</description>
</property>
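The scratch directory referenced above must exist and be writable by the
user running Hive; a minimal sketch, assuming HIVE_HOME points at
/usr/local/hive as elsewhere in this thread:

# Create the local scratch/log directory used by the three properties above:
mkdir -p "$HIVE_HOME/iotmp"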



On Tue, Jan 17, 2017 at 10:01 PM, Dongjoon Hyun <dongj...@apache.org> wrote:

> Hi, Chetan.
>
> Did you copy your `hive-site.xml` into Spark conf directory? For example,
>
> cp /usr/local/hive/conf/hive-site.xml /usr/local/spark/conf
>
> If you want to use the existing Hive metastore, you need to provide that
> information to Spark.
>
> Bests,
> Dongjoon.
>
> On 2017-01-16 21:36 (-0800), Chetan Khatri <chetan.opensou...@gmail.com>
> wrote:
> > Hello,
> >
> > I have the following services configured and installed successfully:
> >
> > Hadoop 2.7.x
> > Spark 2.0.x
> > HBase 1.2.4
> > Hive 1.2.1
> >
> > *Installation Directories:*
> >
> > /usr/local/hadoop
> > /usr/local/spark
> > /usr/local/hbase
> >
> > *Hive Environment variables:*
> >
> > #HIVE VARIABLES START
> > export HIVE_HOME=/usr/local/hive
> > export PATH=$PATH:$HIVE_HOME/bin
> > #HIVE VARIABLES END
> >
> > So, I can access Hive from anywhere, as the environment variables are
> > configured. Now, if I start my spark-shell & hive from the location
> > /usr/local/hive, then both share the Hive metastore correctly;
> > otherwise, Spark creates its own metastore in whatever directory I
> > start spark-shell from.
> >
> > i.e., I am reading from HBase and writing to Hive using Spark. I don't
> > know why this weird issue occurs.
> >
> >
> >
> >
> > Thanks.
> >
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
