[ https://issues.apache.org/jira/browse/HIVE-13651?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15296662#comment-15296662 ]

Sergey Shelukhin commented on HIVE-13651:
-----------------------------------------

[~jdere] [~sseth] how are these credentials normally provided to a Tez job? Is
this about HDFS/HBase/etc. tokens - I assume HS2 issues those, right?
How does SparkSQL currently get HDFS access to read directly from Hive (or
HBase, for example)?
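
For reference, a minimal sketch of how delegation tokens are typically
gathered on the client side: they are collected into a Hadoop Credentials
object, which the submitter (Tez for a DAG, or the LLAP external client here)
then ships with the work. The class, path, and renewer below are illustrative
placeholders, not Hive's actual code:

{code}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.Credentials;

public class TokenSketch {
  // Gathers an HDFS delegation token for the given path into a Credentials
  // object so it can be shipped along with the submitted work.
  public static Credentials gatherHdfsToken(Configuration conf, Path tablePath,
      String renewer) throws IOException {
    Credentials creds = new Credentials();
    FileSystem fs = tablePath.getFileSystem(conf);
    // In a secure cluster this asks the NameNode for a delegation token and
    // stores it in creds, keyed by the filesystem's canonical service name.
    fs.addDelegationTokens(renewer, creds);
    return creds;
  }
}
{code}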

> LlapBaseInputFormat: figure out where credentials come from
> -----------------------------------------------------------
>
>                 Key: HIVE-13651
>                 URL: https://issues.apache.org/jira/browse/HIVE-13651
>             Project: Hive
>          Issue Type: Sub-task
>          Components: llap
>            Reporter: Jason Dere
>
> todo in LlapBaseInputFormat.constructSubmitWorkRequestProto()
> {code}
>     // TODO Figure out where credentials will come from. Normally Hive sets up
>     // URLs on the tez dag, for which Tez acquires credentials.
>     //    taskCredentials.addAll(getContext().getCredentials());
>     //    Preconditions.checkState(currentQueryIdentifierProto.getDagIdentifier() ==
>     //        taskSpec.getTaskAttemptID().getTaskID().getVertexID().getDAGId().getId());
>     //    ByteBuffer credentialsBinary = credentialMap.get(currentQueryIdentifierProto);
>     //    if (credentialsBinary == null) {
>     //      credentialsBinary = serializeCredentials(getContext().getCredentials());
>     //      credentialMap.putIfAbsent(currentQueryIdentifierProto, credentialsBinary.duplicate());
>     //    } else {
>     //      credentialsBinary = credentialsBinary.duplicate();
>     //    }
>     //    builder.setCredentialsBinary(ByteString.copyFrom(credentialsBinary));
> {code}
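
The serializeCredentials() helper referenced in the commented-out code above
typically amounts to writing the Credentials object in its Writable wire
format into a ByteBuffer that can be passed to ByteString.copyFrom(). A
sketch of that shape using standard Hadoop APIs (an assumption about the
eventual implementation, not the actual Hive code):

{code}
import java.io.IOException;
import java.nio.ByteBuffer;

import org.apache.hadoop.io.DataOutputBuffer;
import org.apache.hadoop.security.Credentials;

public class CredentialsSerializer {
  // Serializes all tokens and secret keys held by the Credentials object
  // into a ByteBuffer, suitable for builder.setCredentialsBinary(
  // ByteString.copyFrom(...)) on the submit-work request.
  public static ByteBuffer serializeCredentials(Credentials credentials) throws IOException {
    Credentials copy = new Credentials();
    copy.addAll(credentials);
    // DataOutputBuffer is a resettable, in-memory DataOutputStream.
    DataOutputBuffer dob = new DataOutputBuffer();
    copy.writeTokenStorageToStream(dob);
    return ByteBuffer.wrap(dob.getData(), 0, dob.getLength());
  }
}
{code}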



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
