You can follow along with what I do here:

https://github.com/edwardcapriolo/hive_test

Essentially, Hive requires a HADOOP_HOME because it always wants to fork a
bin/hadoop process. hive_test helps by unpacking Hadoop inside target/ and
pointing your HADOOP_HOME at that directory.
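Roughly, the trick can be sketched like this (the paths and version below are illustrative assumptions, not hive_test's exact layout):

```shell
# Sketch: unpack a Hadoop distribution under target/ and point
# HADOOP_HOME at it, so Hive forks bin/hadoop from there instead of
# requiring a system-wide install.
HADOOP_VERSION=1.2.1                                  # assumed version
HADOOP_HOME="$(pwd)/target/hadoop-${HADOOP_VERSION}"
# The real setup untars a Hadoop tarball here; we just create the
# directory layout for illustration:
mkdir -p "${HADOOP_HOME}/bin"
export HADOOP_HOME
export PATH="${HADOOP_HOME}/bin:${PATH}"
```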

It would be nice if there was some other way to do this.
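On the Maven-dependency side of the quoted question, a minimal sketch of the coordinates involved (versions here are illustrative assumptions, not a tested pom):

```xml
<!-- Illustrative only: adjust versions to match your build. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>0.12.0</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.2.1</version>
</dependency>
```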


On Fri, Dec 27, 2013 at 10:27 PM, Jay Vyas <jayunit...@gmail.com> wrote:

> Hi Hive:
>
> I'm attempting to create a robust Eclipse-based dev environment for
> testing my Hive jobs in local mode; however, I run into ClassNotFound
> errors depending on which Maven dependencies I use. Also, it seems that
> when I change these dependencies from Hive 0.12 to Hive 0.11, I get other
> errors related to Hive trying to launch jobs by calling /usr/bin/hadoop.
>
> Thus I am stuck: I can't run Hive 12 in local Java mode because of subtle
> datanucleus class and API inconsistencies which are tough to resolve, and
> when going to hive 11, it seems local mode is not natively detected via the
> jdbc URL...
>
> So I have 2 questions:
>
> 0) how does hive 12 versus 11 implement local mode differently ?
>
> And
>
> 1) What is the right way to run Hive in pure Java/local environments?
>
> The Hive book suggests modifying configuration properties for local mode..
>
> but I also have found that in Hive 0.12, using the jdbc://hive
> connection URL automagically launches jobs in local mode..
>
> However, in 0.11, I see calls to /usr/bin/hadoop when running Java classes
> in a local Eclipse environment.
>
> Thanks!
>
> FYI to see an example of my pom.xml, you can checkout the
> github://jayunit100/bigpetstore pom.xml file.
>
