You don't need to copy these jars manually; just specify them on the
interpreter setting page.
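
For reference, a rough sketch of what that looks like under that
interpreter's Dependencies section (the artifact versions below are just
the ones from the docs link quoted further down in this thread; use
whatever matches your cluster):

    artifact                                 exclude
    org.apache.hive:hive-jdbc:0.14.0
    org.apache.hadoop:hadoop-common:2.6.0

Zeppelin resolves these from the Maven repository when the interpreter is
restarted, so nothing needs to be copied into the Zeppelin installation by
hand.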

On Wed, Aug 31, 2016 at 9:52 PM, Abhi Basu <9000r...@gmail.com> wrote:

> Where do these jars have to be placed?
>
> I thought copying hive-site.xml and pointing to the Hadoop conf folder in the
> zeppelin conf should be enough (like before).
>
> Thanks,
>
> Abhi
>
> On Tue, Aug 30, 2016 at 6:59 PM, Jeff Zhang <zjf...@gmail.com> wrote:
>
>> You need to add the following two dependencies on the interpreter setting
>> page.
>>
>> https://zeppelin.apache.org/docs/0.6.1/interpreter/hive.html#dependencies
>>
>> org.apache.hive:hive-jdbc:0.14.0
>> org.apache.hadoop:hadoop-common:2.6.0
>>
>>
>> On Wed, Aug 31, 2016 at 2:39 AM, Abhi Basu <9000r...@gmail.com> wrote:
>>
>>> Folks:
>>>
>>> Seems like a config issue.
>>>
>>> 1. Copied hive-site.xml into /ZEPP_HOME/conf folder
>>> 2. Added the following to the config file:
>>>
>>> export JAVA_HOME=/...../...
>>> export HADOOP_CONF_DIR=/etc/hadoop/conf
>>>
>>>
>>> I am using Zeppelin again after a while, and it looks like the Hive
>>> interpreter is now part of the JDBC interpreter.
>>> Interpreter properties seem to be set correctly:
>>> Property         Value
>>> hive.driver      org.apache.hive.jdbc.HiveDriver
>>> hive.url         jdbc:hive2://localhost:10000
>>> hive.user        hiveUser
>>> hive.password    hivePassword
>>>
>>> When I run %hive from Zeppelin, I get a Hive JDBC driver not found error.
>>> How do I fix this? Also, how do I configure Impala within the JDBC section
>>> of the interpreters?
>>>
>>> Thanks,
>>>
>>> Abhi
>>>
>>> --
>>> Abhi Basu
>>>
>>
>>
>>
>> --
>> Best Regards
>>
>> Jeff Zhang
>>
>
>
>
> --
> Abhi Basu
>



-- 
Best Regards

Jeff Zhang
