Are you using SQL Authorisation? If you create tables using the Hive CLI,
you won't be able to select from them over a connection to the Hive
server.
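
If the problem is instead that HiveServer2 is reading HDFS as user "hive"
rather than as the connecting user, the impersonation setting in the page
Alan linked is the usual fix. A minimal sketch of the relevant
hive-site.xml fragment (assuming a standard HiveServer2 deployment; check
your distribution's defaults):

```xml
<!-- hive-site.xml: run queries as the connecting user instead of the
     "hive" service user -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>true</value>
</property>
```

With doAs disabled, HS2 accesses HDFS as its own service user, so a 711
table directory owned by myuser is unreadable to it; with doAs enabled the
access is performed as myuser. Note that on a Kerberized cluster HDFS must
also allow the hive principal to proxy other users, via
hadoop.proxyuser.hive.hosts / hadoop.proxyuser.hive.groups in
core-site.xml.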

On Thu, 18 Apr 2019 at 04:34, Alan Gates <alanfga...@gmail.com> wrote:

> See
> https://cwiki.apache.org/confluence/display/Hive/Setting+up+HiveServer2#SettingUpHiveServer2-Impersonation
>
> Alan.
>
> On Tue, Apr 16, 2019 at 10:03 PM Kaidi Zhao <kz...@salesforce.com> wrote:
>
>> Hello!
>>
>> Did I miss anything here, or is this a known issue? Hive 1.2.1, Hadoop
>> 2.7.x, Kerberos, impersonation.
>>
>> Using the hive client, I create a Hive db and table. I can select from
>> this table correctly.
>> In HDFS, I change the table folder's permission to 711. In the hive
>> client, I can still select from the table.
>> However, when using the beeline client (which talks to HS2, I believe),
>> it complains that it can't read the table folder in HDFS, something like:
>>
>> Error: Error while compiling statement: FAILED: SemanticException Unable
>> to fetch table fact_app_logs. java.security.AccessControlException:
>> Permission denied: user=hive, access=READ,
>> inode="/data/mydb.db/my_table":myuser:mygroup:drwxr-x--x
>> at
>> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:307)
>> at
>> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:220)
>> at
>> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
>> at
>> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1752)
>> at
>> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1736)
>> at
>> org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1710)
>> at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8220)
>> at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:1932)
>> at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1455)
>> at
>> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>> at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>> at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
>> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2218)
>> at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2214)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:422)
>> at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1760)
>> at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2212)
>> (state=42000,code=40000)
>>
>> Note that, from the log, it tries to read the table's folder as user
>> "hive" (instead of my own user "myuser"), while the folder is only
>> readable by its owner, myuser.
>> Again, using the hive client I can read the table, but using beeline I
>> can't.
>> If I change the folder's permission to 755, then it works.
>>
>> Why does beeline / HS2 need to use "hive" to read the table's folder?
>>
>> Thanks in advance.
>>
>> Kaidi
>>
>>
>>