Hi Rahman,

On 24 March 2014 16:45, Abdelrahman Shettia <ashet...@hortonworks.com> wrote:
> Hi Oliver,
>
> Can you perform a simple test of hadoop fs -cat
> hdfs:///logs/2014/03-24/actual_log_file_name.seq by the same user? Also
> what are the configuration settings for the following?

Yes, I can access that file with the same user using "hadoop fs -cat" as
well as other tools (I've been using Pig up until this point). The exact
commands I ran are sketched after the quoted thread below.

> hive.metastore.execute.setugi

I'm not setting this explicitly anywhere.

> hive.metastore.warehouse.dir

I have this set in my HiveQL script:

SET hive.metastore.warehouse.dir=/user/oliver/warehouse;

This directory already exists, since I created it before running the script.

> hive.metastore.uris

Not explicitly set anywhere to my knowledge.

> Thanks,
> Rahman
>
> On Mar 24, 2014, at 8:17 AM, Oliver <ohook...@gmail.com> wrote:
>
> Hi,
>
> I have a bunch of data already in place in a directory on HDFS containing
> many different logs of different types, so I'm attempting to load these
> externally like so:
>
> CREATE EXTERNAL TABLE mylogs (line STRING) STORED AS SEQUENCEFILE LOCATION
> 'hdfs:///logs/2014/03-24/actual_log_file_name.seq';
>
> However I get this error back when doing so:
>
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got
> exception: org.apache.hadoop.security.AccessControlException Permission
> denied: user=oliver, access=WRITE,
> inode="/logs/2014/03-24":logs:supergroup:drwxr-xr-x
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:224)
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:204)
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:149)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4716)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4698)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4672)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:3035)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:2999)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2980)
>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:648)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:419)
>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44970)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1701)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1697)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Unknown Source)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1695)
> )
>
> This directory is intentionally read-only by regular users who want to
> read the logs and analyse them. Am I missing some configuration data for
> Hive that will tell it to only store metadata elsewhere?
> I already have hive.metastore.warehouse.dir set to another location where
> I have write permission.
>
> Best Regards,
> Oliver
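For reference, a minimal shell sketch of the checks discussed above: the
read test Rahman suggested, plus listing the permissions on the directory
named in the AccessControlException and on the configured warehouse
directory. Paths are taken verbatim from the thread; nothing else is
assumed.

# Read test suggested earlier in the thread, run as the same user (oliver);
# output is discarded since the SequenceFile is binary:
hadoop fs -cat hdfs:///logs/2014/03-24/actual_log_file_name.seq > /dev/null

# Permissions of the directory named in the error. The exception reports
# the inode as logs:supergroup:drwxr-xr-x, i.e. writable only by the
# 'logs' user. The -d flag lists the directory itself, not its contents:
hadoop fs -ls -d /logs/2014/03-24

# Permissions of the directory set via hive.metastore.warehouse.dir:
hadoop fs -ls -d /user/oliver/warehouse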
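As a point of comparison only: the LOCATION clause of a Hive external
table normally names a directory rather than a single file, and the stack
trace above fails inside a mkdirs call on the NameNode. Below is a
hypothetical variant of the CREATE EXTERNAL TABLE statement quoted above
with LOCATION pointing at the containing directory; the path is lifted
from the thread, and nothing in the thread confirms that this variant
avoids the error.

-- Hypothetical sketch: same table definition, but LOCATION names the
-- directory that already holds the SequenceFiles (path taken from the
-- thread; not confirmed as a fix).
CREATE EXTERNAL TABLE mylogs (line STRING)
  STORED AS SEQUENCEFILE
  LOCATION 'hdfs:///logs/2014/03-24';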