[ 
https://issues.apache.org/jira/browse/HIVE-22649?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17007776#comment-17007776
 ] 

Jason Dere commented on HIVE-22649:
-----------------------------------

Hi [~dkuzmenko], sorry for responding late on this one.

It may be that HIVE-22599 causes this, if the logic that initializes the query 
results cache runs before the call to ensurePathIsWritable(). In that case the 
mkdirs() call in the QueryResultsCache.QueryResultsCache() constructor might be 
the first to create the "/tmp/hive" directory, and it would do so with the default 
perms (per 
[http://hadoop.apache.org/docs/r2.7.5/api/src-html/org/apache/hadoop/fs/FileSystem.html#line.596]).
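To make the permissions difference concrete, here is a minimal sketch using the 
plain Hadoop FileSystem API (not Hive code); the 733 permission is only an 
illustration of a "writable" scratch-dir perm, not necessarily what Hive sets:

{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class ScratchDirPermsSketch {

  // What effectively happens if the results cache creates /tmp/hive first:
  // mkdirs() without an FsPermission argument applies the FileSystem default
  // (0777 masked by fs.permissions.umask-mode, typically 022 -> rwxr-xr-x),
  // which matches the permissions reported in the error quoted below.
  static void createdByResultsCache(FileSystem fs, Path scratch) throws Exception {
    fs.mkdirs(scratch);
  }

  // What the session-setup path is meant to guarantee: the dir exists and is
  // writable. 733 here is only illustrative of a "writable" permission.
  static void createdBySessionSetup(FileSystem fs, Path scratch) throws Exception {
    fs.mkdirs(scratch);
    fs.setPermission(scratch, new FsPermission((short) 0733));
  }

  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.get(new Configuration());
    Path scratch = new Path("/tmp/hive");
    createdByResultsCache(fs, scratch);    // leaves rwxr-xr-x under the default umask
    // createdBySessionSetup(fs, scratch); // would leave it writable
  }
}
{code}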

Trying to think of what options we have here:

1) Revert HIVE-22599

2) Make sure that ensurePathIsWritable(HiveConf.ConfVars.SCRATCHDIR) is called 
before QueryResultsCache.initialize() during HiveServer2.init() (or maybe within 
QueryResultsCache.initialize() itself), so that the session tmp dir is created 
with the correct perms; a rough sketch of that ordering follows.
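This is only a sketch of option 2's ordering: the package names and the 
QueryResultsCache.initialize(HiveConf) signature are assumed from current Hive, 
the surrounding HiveServer2.init() code is elided, and the helper name 
initResultsCache() is made up for illustration.

{code}
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.cache.results.QueryResultsCache;
import org.apache.hadoop.hive.ql.exec.Utilities;

public class StartupOrderSketch {

  // Option 2 in a nutshell: make the scratch dir writable before the results
  // cache gets a chance to mkdirs() anything under it.
  static void initResultsCache(HiveConf hiveConf) throws IOException {
    // First create (or re-check) the root scratch dir with the correct perms,
    // the same way SessionState.createRootHDFSDir() does.
    Path rootHDFSDirPath = new Path(HiveConf.getVar(hiveConf, HiveConf.ConfVars.SCRATCHDIR));
    Utilities.ensurePathIsWritable(rootHDFSDirPath, hiveConf);

    // Only then initialize the cache, so its mkdirs() calls merely add
    // subdirectories under an already writable /tmp/hive.
    QueryResultsCache.initialize(hiveConf);
  }
}
{code}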

> Fix TestHiveCli: scratchdir should be writable
> ----------------------------------------------
>
>                 Key: HIVE-22649
>                 URL: https://issues.apache.org/jira/browse/HIVE-22649
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Denys Kuzmenko
>            Assignee: Denys Kuzmenko
>            Priority: Major
>         Attachments: HIVE-22649.1.patch, HIVE-22649.2.patch
>
>
> Error applying authorization policy on hive configuration: The dir: /tmp/hive 
> on HDFS should be writable. Current permissions are: rwxr-xr-x
> SessionState.java
> {code}
>   private Path createRootHDFSDir(HiveConf conf) throws IOException {
>     Path rootHDFSDirPath = new Path(HiveConf.getVar(conf, HiveConf.ConfVars.SCRATCHDIR));
>     Utilities.ensurePathIsWritable(rootHDFSDirPath, conf);
>     return rootHDFSDirPath;
>   }
> {code}



