[ 
https://issues.apache.org/jira/browse/HIVE-5523?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13971876#comment-13971876
 ] 

Sushanth Sowmyan commented on HIVE-5523:
----------------------------------------

I'm canceling this patch because I feel there is still more code cleanup 
required here to make things more obvious, and code cleanup is now the main 
point of this patch, since the originally reported issue works without it. 
HIVE-6915 fixes adding the delegation token for Tez, and in doing so raises 
the question of whether the current approach works because it happens to work 
or because that is how it is supposed to work; I'm leaning towards the former.
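
For context, the change being discussed amounts to Hive obtaining an HBase 
delegation token at job-submission time and placing it in the job's 
credentials, so that MR/Tez tasks can authenticate to a secure HBase without 
a Kerberos ticket. A minimal sketch, assuming the HBase 0.94 security client 
API; the helper name is illustrative, not the committed Hive code:

{noformat}
import java.io.IOException;

import org.apache.hadoop.hbase.security.User;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.security.UserGroupInformation;

public class HBaseTokenSketch {
  // Hypothetical helper: request an HBase delegation token for the submitting
  // user and store it in the JobConf's credentials, where the task-side UGI
  // will find it. Assumes hbase-site.xml is visible on the classpath (as in
  // the hive.aux.jars.path setting quoted below).
  public static void addHBaseDelegationToken(JobConf jobConf) throws IOException {
    if (!UserGroupInformation.isSecurityEnabled()) {
      return; // insecure cluster: nothing to ship
    }
    try {
      User.getCurrent().obtainAuthTokenForJob(jobConf);
    } catch (InterruptedException ie) {
      Thread.currentThread().interrupt();
      throw new IOException("Interrupted while obtaining HBase auth token", ie);
    }
  }
}
{noformat}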

> HiveHBaseStorageHandler should pass kerberos credentials down to HBase
> ----------------------------------------------------------------------
>
>                 Key: HIVE-5523
>                 URL: https://issues.apache.org/jira/browse/HIVE-5523
>             Project: Hive
>          Issue Type: Bug
>          Components: HBase Handler
>    Affects Versions: 0.11.0
>            Reporter: Nick Dimiduk
>            Assignee: Sushanth Sowmyan
>         Attachments: HIVE-5523.patch, Task Logs_ 
> 'attempt_201310110032_0023_r_000000_0'.html
>
>
> Running on a secured cluster, I have an HBase table defined thusly
> {noformat}
> CREATE TABLE IF NOT EXISTS pagecounts_hbase (rowkey STRING, pageviews STRING, 
> bytes STRING)
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,f:c1,f:c2')
> TBLPROPERTIES ('hbase.table.name' = 'pagecounts');
> {noformat}
> and a query to populate that table
> {noformat}
> -- ensure hbase dependency jars are shipped with the MR job
> SET hive.aux.jars.path = 
> file:///etc/hbase/conf/hbase-site.xml,file:///usr/lib/hive/lib/hive-hbase-handler-0.11.0.1.3.2.0-111.jar,file:///usr/lib/hbase/hbase-0.94.6.1.3.2.0-111-security.jar,file:///usr/lib/zookeeper/zookeeper-3.4.5.1.3.2.0-111.jar;
> -- populate our hbase table
> FROM pgc INSERT INTO TABLE pagecounts_hbase SELECT pgc.* WHERE rowkey LIKE 
> 'en/q%' LIMIT 10;
> {noformat}
> The reduce tasks fail with what boils down to the following exception:
> {noformat}
> Caused by: java.lang.RuntimeException: SASL authentication failed. The most 
> likely cause is missing or invalid credentials. Consider 'kinit'.
>       at 
> org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection$1.run(SecureClient.java:263)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:396)
>       at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
>       at org.apache.hadoop.hbase.security.User.call(User.java:590)
>       at org.apache.hadoop.hbase.security.User.access$700(User.java:51)
>       at 
> org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:444)
>       at 
> org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.handleSaslConnectionFailure(SecureClient.java:224)
>       at 
> org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupIOstreams(SecureClient.java:313)
>       at 
> org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1124)
>       at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:974)
>       at 
> org.apache.hadoop.hbase.ipc.SecureRpcEngine$Invoker.invoke(SecureRpcEngine.java:104)
>       at $Proxy10.getProtocolVersion(Unknown Source)
>       at 
> org.apache.hadoop.hbase.ipc.SecureRpcEngine.getProxy(SecureRpcEngine.java:146)
>       at org.apache.hadoop.hbase.ipc.HBaseRPC.waitForProxy(HBaseRPC.java:208)
>       at 
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1346)
>       at 
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1305)
>       at 
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1292)
>       at 
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1001)
>       at 
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:896)
>       at 
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:998)
>       at 
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:900)
>       at 
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:857)
>       at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:234)
>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:174)
>       at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:133)
>       at 
> org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat.getHiveRecordWriter(HiveHBaseTableOutputFormat.java:83)
>       at 
> org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:250)
>       at 
> org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:237)
>       ... 17 more
> {noformat}
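
The SASL failure above is raised in the reduce task when 
HiveHBaseTableOutputFormat.getHiveRecordWriter opens an HTable: the task has 
no Kerberos ticket, and no HBase delegation token was shipped with the job, 
so the secure RPC handshake cannot complete. For comparison, a plain 
MapReduce job writing to secure HBase would normally request such a token at 
submit time; a minimal sketch, assuming HBase 0.94's TableMapReduceUtil (not 
Hive code):

{noformat}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

public class SecureHBaseJobSetup {
  public static Job newHBaseJob() throws IOException {
    // Picks up hbase-site.xml from the classpath, including the
    // hbase.security.authentication setting.
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "write-to-secure-hbase");
    // Obtains an HBase delegation token for the submitting user and stores it
    // in job.getCredentials(); tasks then authenticate with the token instead
    // of a Kerberos TGT, which is exactly what the failing reducer above lacks.
    TableMapReduceUtil.initCredentials(job);
    return job;
  }
}
{noformat}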



--
This message was sent by Atlassian JIRA
(v6.2#6252)
