[ https://issues.apache.org/jira/browse/HIVE-22758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17021649#comment-17021649 ]

Hive QA commented on HIVE-22758:
--------------------------------



Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12991499/HIVE-22758.1.patch

{color:red}ERROR:{color} -1 due to build exiting with an error

Test results: 
https://builds.apache.org/job/PreCommit-HIVE-Build/20283/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/20283/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-20283/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2020-01-23 00:51:12.558
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-20283/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2020-01-23 00:51:12.561
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 05cabc8 HIVE-22666: Introduce TopNKey operator for PTF Reduce Sink (Krisztian Kasa, reviewed by Jesus Camacho Rodriguez)
+ git clean -f -d
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 05cabc8 HIVE-22666: Introduce TopNKey operator for PTF Reduce Sink (Krisztian Kasa, reviewed by Jesus Camacho Rodriguez)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2020-01-23 00:51:13.475
+ rm -rf ../yetus_PreCommit-HIVE-Build-20283
+ mkdir ../yetus_PreCommit-HIVE-Build-20283
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-20283
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-20283/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Trying to apply the patch with -p0
fatal: corrupt patch at line 42
Trying to apply the patch with -p1
fatal: corrupt patch at line 42
Trying to apply the patch with -p2
fatal: corrupt patch at line 42
The patch does not appear to apply with p0, p1, or p2
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-20283
+ exit 1
'
{noformat}
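The failure above is a patch-application error, not a test failure: the attached file is rejected as a corrupt diff at every strip level. Before re-uploading, a patch can be sanity-checked locally with `git apply --check`, roughly mirroring what smart-apply-patch.sh appears to do (a hedged sketch; the actual script contents are an assumption):

```shell
# try_apply: report the first strip level (0..2) at which a patch
# applies cleanly, or fail if none works -- the same p0/p1/p2 probing
# the pre-commit log shows.
try_apply() {
  patch_file="$1"
  for p in 0 1 2; do
    # --check validates without modifying the working tree
    if git apply --check -p"$p" "$patch_file" 2>/dev/null; then
      echo "patch applies with -p$p"
      return 0
    fi
  done
  echo "patch does not apply with p0, p1, or p2" >&2
  return 1
}
```

A patch produced by plain `git diff` carries `a/` and `b/` path prefixes and therefore applies at `-p1`; a truncated or mangled upload fails at every level, which matches the "corrupt patch at line 42" output above.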

This message is automatically generated.

ATTACHMENT ID: 12991499 - PreCommit-HIVE-Build

> Create database with permission error when doas set to true
> -----------------------------------------------------------
>
>                 Key: HIVE-22758
>                 URL: https://issues.apache.org/jira/browse/HIVE-22758
>             Project: Hive
>          Issue Type: Bug
>          Components: Standalone Metastore
>    Affects Versions: 3.0.0, 3.1.0
>            Reporter: Chiran Ravani
>            Assignee: Chiran Ravani
>            Priority: Critical
>         Attachments: HIVE-22758.1.patch
>
>
> With doAs set to true, running CREATE DATABASE with an external location fails 
> with a permission-denied error: the hive user (the user HMS runs as) does not 
> have write access to the directory.
> Steps to reproduce:
> 1. Enable doAs (set hive.server2.enable.doAs=true), so queries run as the end user.
> 2. Connect to Hive as a non-admin user, e.g. chiran.
> 3. Create a database with an external location:
> {code}
> create database externaldbexample location '/user/chiran/externaldbexample'
> {code}
> The statement fails because the hive service user does not have write access 
> to the path on HDFS, as shown below.
> {code}
> > create database externaldbexample location '/user/chiran/externaldbexample';
> INFO  : Compiling command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d): create database externaldbexample location '/user/chiran/externaldbexample'
> INFO  : Semantic Analysis Completed (retrial = false)
> INFO  : Returning Hive schema: Schema(fieldSchemas:null, properties:null)
> INFO  : Completed compiling command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d); Time taken: 1.377 seconds
> INFO  : Executing command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d): create database externaldbexample location '/user/chiran/externaldbexample'
> INFO  : Starting task [Stage-0:DDL] in serial mode
> ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.lang.reflect.UndeclaredThrowableException)
> INFO  : Completed executing command(queryId=hive_20200122043626_5c95e1fd-ce00-45fd-b58d-54f5e579f87d); Time taken: 0.238 seconds
> Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.lang.reflect.UndeclaredThrowableException) (state=08S01,code=1)
> {code}
> The Hive Metastore service log shows the following:
> {code}
> 2020-01-22T04:36:27,870 WARN  [pool-6-thread-6]: metastore.ObjectStore (ObjectStore.java:getDatabase(1010)) - Failed to get database hive.externaldbexample, returning NoSuchObjectException
> 2020-01-22T04:36:27,898 INFO  [pool-6-thread-6]: metastore.HiveMetaStore (HiveMetaStore.java:run(1339)) - Creating database path in managed directory hdfs://c470-node2.squadron.support.hortonworks.com:8020/user/chiran/externaldbexample
> 2020-01-22T04:36:27,903 INFO  [pool-6-thread-6]: utils.FileUtils (FileUtils.java:mkdir(170)) - Creating directory if it doesn't exist: hdfs://namenodeaddress:8020/user/chiran/externaldbexample
> 2020-01-22T04:36:27,932 ERROR [pool-6-thread-6]: utils.MetaStoreUtils (MetaStoreUtils.java:logAndThrowMetaException(169)) - Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=hive, access=WRITE, inode="/user/chiran":chiran:chiran:drwxr-xr-x
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:399)
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:255)
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:193)
>         at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1859)
>         at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1843)
>         at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1802)
>         at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:59)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3150)
>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1126)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:707)
>         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:524)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1025)
>         at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:876)
>         at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:822)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2682)
> {code}
> The behavior appears to be a regression of HIVE-20001.
> Can we add a check for whether the user specified an external path, and if so 
> create the database directory as the end user instead of the hive service user?
> Attaching a patch, which was tested locally on Hive 3.1.
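The decision the reporter proposes can be sketched as follows. This is a hypothetical, JDK-only illustration, not Hive's actual API: the method names, the warehouse root, and the `chooseCreator` helper are all invented for this sketch; the real patch operates on HMS internals.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class CreateDbDirSketch {
    // Is the requested database location inside the managed warehouse root?
    static boolean isUnderWarehouse(Path location, Path warehouseRoot) {
        return location.normalize().startsWith(warehouseRoot.normalize());
    }

    // Proposed rule: managed paths are created by the HMS service user;
    // external paths are created as the requesting end user, so the mkdir
    // is checked against the end user's HDFS permissions, not hive's.
    static String chooseCreator(Path location, Path warehouseRoot,
                                String serviceUser, String endUser) {
        return isUnderWarehouse(location, warehouseRoot) ? serviceUser : endUser;
    }

    public static void main(String[] args) {
        Path warehouse = Paths.get("/warehouse/tablespace/managed/hive");
        Path external = Paths.get("/user/chiran/externaldbexample");
        // External location: create as the end user.
        System.out.println(chooseCreator(external, warehouse, "hive", "chiran"));
        // Managed location: create as the metastore service user.
        System.out.println(chooseCreator(warehouse.resolve("db1.db"), warehouse, "hive", "chiran"));
    }
}
```

Under this rule, the mkdir for /user/chiran/externaldbexample would run as chiran, avoiding the AccessControlException seen in the metastore log.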



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
