To follow up on this one, I figured out that the bug resolved here [1]
was the main source of my problems. It changed the signature of the
NetUtils#getInputStream method, causing a "NoSuchMethodError" to be
thrown downstream. I upgraded hive 0.9 to use the hbase jar packaged
with the CDH4 GA release (hbase-0.92.1-cdh4.0.0.jar) and everything
worked after that.

[1] https://issues.apache.org/jira/browse/HADOOP-8350
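
To make the failure mode concrete: the JVM links a call site against
the full method descriptor, return type included. Here is a minimal
sketch (my own illustration, not the actual Hadoop source) of the old
shape of the method and why the change breaks binary compatibility:

    import java.io.InputStream;
    import java.net.Socket;

    public class NetUtilsLookalike {
        // Old shape of the method that the pre-CDH4 HBase jar was
        // compiled against. HADOOP-8350 changed the return type to
        // Hadoop's SocketInputWrapper (an InputStream subclass), so the
        // exact descriptor the old HBase bytecode asks for,
        //   getInputStream(Ljava/net/Socket;)Ljava/io/InputStream;
        // no longer exists in the new hadoop-common jar, and the JVM
        // throws NoSuchMethodError at the call site even though the
        // source would still compile against either version.
        public static InputStream getInputStream(Socket socket)
                throws java.io.IOException {
            return socket.getInputStream();
        }
    }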

On Mon, Jul 16, 2012 at 5:08 PM, kulkarni.swar...@gmail.com <
kulkarni.swar...@gmail.com> wrote:

> Yeah. I did override hadoop.security.version to 2.0.0-alpha. That gives me
> a whole bunch of compilation errors in HadoopShimsSecure.java
>
>     [javac]
> /Users/sk018283/git-repo/hive/shims/src/common-secure/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java:37:
> package org.apache.hadoop.mapred does not exist
>     [javac] import org.apache.hadoop.mapred.ClusterStatus;
>     [javac]                                ^
>     [javac]
> /Users/sk018283/git-repo/hive/shims/src/common-secure/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java:38:
> package org.apache.hadoop.mapred does not exist
>     [javac] import org.apache.hadoop.mapred.FileInputFormat;
>     [javac]                                ^
>     [javac]
> /Users/sk018283/git-repo/hive/shims/src/common-secure/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java:39:
> package org.apache.hadoop.mapred does not exist
>     [javac] import org.apache.hadoop.mapred.InputFormat;
>     ....many more......
>
> On Mon, Jul 16, 2012 at 4:48 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> I see the following in build.properties :
>>
>> hadoop.version=${hadoop-0.20.version}
>> hadoop.security.version=${hadoop-0.20S.version}
>>
>> Have you tried to override the above property values when building ?
>>
>> If it still fails, please comment on 
>> HIVE-3029<https://issues.apache.org/jira/browse/HIVE-3029>
>> .
>>
>> Thanks
>>
>>
>> On Mon, Jul 16, 2012 at 2:42 PM, kulkarni.swar...@gmail.com <
>> kulkarni.swar...@gmail.com> wrote:
>>
>>> I found this issue [1] and applied the patch but still the issue
>>> persists.
>>>
>>> Any different way that I should be creating my assembly (currently just
>>> doing "ant clean tar") so that it works with hadoop 2.0.0 on its classpath?
>>>
>>> Any help is appreciated.
>>>
>>> Thanks,
>>>
>>> [1] https://issues.apache.org/jira/browse/HIVE-3029
>>>
>>>
>>> On Fri, Jul 13, 2012 at 10:27 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>
>>>> See
>>>> https://issues.apache.org/jira/browse/HADOOP-8350?focusedCommentId=13414276&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-13414276
>>>>
>>>> Cheers
>>>>
>>>>
>>>> On Fri, Jul 13, 2012 at 12:38 PM, kulkarni.swar...@gmail.com <
>>>> kulkarni.swar...@gmail.com> wrote:
>>>>
>>>>> Has anyone being using hive 0.9.0 release with the CDH4 GA release? I
>>>>> keep hitting this exception on its interaction with HBase.
>>>>>
>>>>> java.lang.NoSuchMethodError:
>>>>> org.apache.hadoop.net.NetUtils.getInputStream(Ljava/net/Socket;)Ljava/io/InputStream;
>>>>>  at
>>>>> org.apache.hadoop.hbase.ipc.HBaseClient$Connection.setupIOstreams(HBaseClient.java:363)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1026)
>>>>>  at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:878)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
>>>>>  at $Proxy9.getProtocolVersion(Unknown Source)
>>>>> at
>>>>> org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:183)
>>>>>  at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:303)
>>>>> at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:280)
>>>>>  at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:332)
>>>>> at
>>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:642)
>>>>>
>>>>> --
>>>>> Swarnim
>>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> Swarnim
>>>
>>
>>
>
>
> --
> Swarnim
>



-- 
Swarnim
