Are you a member of group "g_hdp_storeops"?
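If not, you can check how the NameNode resolves your group memberships and, if adding the user to the group is not an option, grant access with an HDFS ACL as described in the linked article. A minimal sketch, using the user and path that appear in the stack trace later in this thread (ACLs require `dfs.namenode.acls.enabled=true` in hdfs-site.xml):

```shell
# Show the groups HDFS resolves for the user from the error
# (user=dmadjar in the AccessControlException below).
hdfs groups dmadjar

# Inspect the current permissions and any existing ACL entries
# on the directory the traversal check failed on.
hdfs dfs -getfacl /smith/storeops

# Grant the user read + traverse (EXECUTE) access via a named-user
# ACL entry instead of widening the group or the base permissions.
hdfs dfs -setfacl -m user:dmadjar:r-x /smith/storeops
```

Run these as the HDFS superuser or the directory owner; `-setfacl -m` only modifies the entries named, leaving the base `drwxrwx---` bits intact.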

On Mon, May 25, 2015 at 6:21 AM, r7raul1...@163.com <r7raul1...@163.com>
wrote:

> config hdfs acl
>
> http://zh.hortonworks.com/blog/hdfs-acls-fine-grained-permissions-hdfs-files-hadoop/
>
> ------------------------------
> r7raul1...@163.com
>
>
> *From:* Anupam sinha <ak3...@gmail.com>
> *Date:* 2015-05-21 12:44
> *To:* user <user@hive.apache.org>
> *Subject:* Re: upgrade of Hadoop cluster from 5.1.2 to 5.2.1.
>
> Hello everyone,
>
>
> I am a member of a nested group which has SELECT privileges.
>
>
> I am still not able to access the Hive StoreOps database.
>
>
> Please advise.
>
>
> Thank you.
>
>
>
>
>
>
> On Thu, May 21, 2015 at 9:30 AM, Anupam sinha <ak3...@gmail.com> wrote:
>
>> I have upgraded our Hadoop cluster from 5.1.2 to 5.2.1.
>> Now I am unable to read the Hive tables;
>> before the upgrade I was able to access the Hive StoreOps database.
>>
>> Please suggest any changes I need to make.
>>
>>
>> Here is the output I receive:
>>
>> java.io.IOException: org.apache.hadoop.security.AccessControlException:
>> Permission denied: user=dmadjar, access=EXECUTE,
>> inode="/smith/storeops":srv-hdp-storeops-d:g_hdp_storeops:drwxrwx---
>> at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:255)
>> at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:236)
>> at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:178)
>> at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
>> at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
>> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6250)
>> at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3942)
>> at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:811)
>> at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:502)
>> at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:815)
>> at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>> at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:587)
>>
>>
>>
>> Thank you,
>>
>
>
