[ https://issues.apache.org/jira/browse/HIVE-15767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16117956#comment-16117956 ]
Peter Cseh commented on HIVE-15767:
-----------------------------------

I don't remember all the details, but here is a longer stack trace:
{code}
java.lang.RuntimeException: java.io.IOException: Exception reading file:/yarn/nm/usercache/yshi/appcache/application_1485271416004_0001/container_1485271416004_0001_01_000002/container_tokens
	at org.apache.hadoop.mapreduce.security.TokenCache.mergeBinaryTokens(TokenCache.java:160)
	at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:138)
	at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
	at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:243)
{code}
Spark does not reference {{mapreduce.job.credentials.binary}} directly; the property name is hard-coded in Hadoop's TokenCache [here|https://github.com/apache/hadoop/blob/f67237cbe7bc48a1b9088e990800b37529f1db2a/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/main/java/org/apache/hadoop/mapreduce/security/TokenCache.java#L148]. I think this TokenCache is used by Hadoop's FileSystem implementation too, so when Spark talks to HDFS it goes through this class.

> Hive On Spark is not working on secure clusters from Oozie
> ----------------------------------------------------------
>
>                 Key: HIVE-15767
>                 URL: https://issues.apache.org/jira/browse/HIVE-15767
>             Project: Hive
>          Issue Type: Bug
>          Components: Spark
>    Affects Versions: 1.2.1, 2.1.1
>            Reporter: Peter Cseh
>            Assignee: Peter Cseh
>         Attachments: HIVE-15767-001.patch, HIVE-15767-002.patch, HIVE-15767.1.patch
>
>
> When a HiveAction is launched from Oozie with Hive On Spark enabled, we're getting errors:
> {noformat}
> Caused by: java.io.IOException: Exception reading file:/yarn/nm/usercache/yshi/appcache/application_1485271416004_0022/container_1485271416004_0022_01_000002/container_tokens
> 	at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:188)
> 	at org.apache.hadoop.mapreduce.security.TokenCache.mergeBinaryTokens(TokenCache.java:155)
> {noformat}
> This is caused by passing the {{mapreduce.job.credentials.binary}} property to the Spark configuration in RemoteHiveSparkClient.
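For readers without the Hadoop source at hand, the hard-coded lookup the comment points at works roughly as in the sketch below. This is a simplified paraphrase of the linked {{TokenCache#mergeBinaryTokens}}, not a verbatim copy: the class name {{TokenCacheLookupSketch}} and the inline comments are illustrative only, while the Hadoop calls ({{Credentials.readTokenStorageFile}}, {{Credentials#mergeAll}}, {{FileSystem.getLocal}}) are the real APIs seen in the stack traces above.
{code}
// Illustrative paraphrase of the hard-coded lookup in Hadoop's TokenCache
// (simplified from the linked source, not a verbatim copy).
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.Credentials;

public class TokenCacheLookupSketch {   // hypothetical name, for illustration only

  static void mergeBinaryTokens(Credentials creds, Configuration conf)
      throws IOException {
    // The property name is hard-coded here; Spark never reads it directly,
    // but it inherits the value when the Hive configuration is forwarded to
    // the Spark context.
    String binaryTokenFilename = conf.get("mapreduce.job.credentials.binary");
    if (binaryTokenFilename != null) {
      // The path is resolved against the local file system of whichever
      // process runs this code. A launcher-local container_tokens path is
      // presumably not readable once the Spark side reaches this code while
      // listing HDFS input (FileInputFormat.listStatus in the trace above).
      Credentials binary = Credentials.readTokenStorageFile(
          FileSystem.getLocal(conf).makeQualified(new Path(binaryTokenFilename)),
          conf);
      creds.mergeAll(binary);
    }
  }
}
{code}
Given that, the direction suggested by the issue description would be to avoid forwarding {{mapreduce.job.credentials.binary}} from RemoteHiveSparkClient into the Spark configuration; see the attached patches for the actual change.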