I did see that, and to try to ensure the right Hadoop libs are being used, 
I've created the following entry in hive-site.xml:

        <property>
                <name>hive.aux.jars.path</name>
                <value>/apps/hadoop/hadoop-0.20.203.0/hadoop-core-0.20.203.0.jar,/apps/hadoop/hadoop-0.20.203.0/hadoop-tools-0.20.203.0.jar</value>
                <description>Force override jar locations</description>
        </property>

Is that correct? Does it try to use $HADOOP_HOME/lib by default? I don't see 
any Hadoop libs in the hive/lib directory at all.
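
For reference, here is the sanity check I'm running on my side (a rough sketch; 
the paths match the property above, so adjust as needed). As far as I can tell, 
the hive launcher script resolves Hadoop through $HADOOP_HOME/bin/hadoop (or a 
hadoop executable on the PATH) and builds its classpath from that installation 
rather than bundling Hadoop jars under hive/lib, which would explain why none 
show up there:

        # assumes the Hadoop install path from the property above
        export HADOOP_HOME=/apps/hadoop/hadoop-0.20.203.0

        # confirm which Hadoop build will be picked up
        $HADOOP_HOME/bin/hadoop version

        # confirm the core jar that installation provides
        ls $HADOOP_HOME/hadoop-core-*.jar

        # start Hive from the same shell so it inherits HADOOP_HOME
        hive

Separately, I notice the failed call in the trace below goes to port 50030, 
which I believe is normally the JobTracker web UI rather than its RPC port, so 
I'll also double-check my mapred.job.tracker setting.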

Ian

From: shashwat shriparv [mailto:dwivedishash...@gmail.com]
Sent: Friday, January 06, 2012 7:28 PM
To: user@hive.apache.org
Subject: Re: Problem with Hive->Hadoop

Check out these posts; they may be useful for you:

http://mail-archives.apache.org/mod_mbox/hive-user/201101.mbox/%3CAANLkTi=vttvpd3z5de24bja8lknwpvochpnrtb7eo...@mail.gmail.com%3E

https://issues.apache.org/jira/browse/HADOOP-4262


http://search-hadoop.com/m/hrAvDrf0Vu/v=threaded
Regards
Shashwat


On Fri, Jan 6, 2012 at 10:43 PM, alo.alt <wget.n...@googlemail.com> wrote:
Hi Ian,

did you use compressed output or compressed files for input?

- Alex

Alexander Lorenz
http://mapredit.blogspot.com


On Jan 6, 2012, at 7:55 AM, <ian.mey...@barclayscapital.com> wrote:

> Hello,
>
> I'm using Hive 0.8 with Hadoop 0.20.203.0. Defining and loading tables worked 
> fine, and I can query them with select *. However, a summary query against 
> any table results in the following exception. I've seen other documented 
> issues like this regarding HBase and the Cloudera stack, but nothing specific 
> to what I'm seeing.
>
> Any thoughts?
>
> java.io.IOException: Call to **********/**********:50030 failed on local exception: java.io.EOFException
>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1065)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1033)
>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:224)
>         at org.apache.hadoop.mapred.$Proxy8.getProtocolVersion(Unknown Source)
>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
>         at org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:460)
>         at org.apache.hadoop.mapred.JobClient.init(JobClient.java:454)
>         at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:437)
>         at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:435)
>         at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:136)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1332)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1123)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:212)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:554)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.io.EOFException
>         at java.io.DataInputStream.readInt(DataInputStream.java:375)
>         at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:774)
>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:712)
> Job Submission failed with exception 'java.io.IOException(Call to **********/**********:50030 failed on local exception: java.io.EOFException)'
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
>
> Ian Meyers
>
> Barclays Capital
> FICC Solutions Architecture
>
> Direct: +44 (0) 20 777 37437  Extension: 37437
> ian.mey...@barclayscapital.com
>



--
Shashwat Shriparv
09900059620
09663531241


