Hi, I'm using Hive 0.11; I downloaded the tarball from Apache's website.
I have a Linux user called "admin" and I invoke the Hive CLI as this user. In the Hive terminal I created a table as follows:

    hive> create table ptest (pkey INT, skey INT, fkey INT, rkey INT, units INT) row format delimited fields terminated by ',' lines terminated by '\n' stored as textfile;
    OK
    Time taken: 0.241 seconds

When I try to load data into the table I get the following error (the rmr that fails appears to be Hive clearing the existing table directory for the OVERWRITE):

    hive> LOAD DATA LOCAL INPATH '/home/admin/sample.csv' OVERWRITE INTO TABLE ptest;
    Copying data from file:/home/admin/sample.csv
    Copying file: file:/home/admin/sample.csv
    Loading data to table default.ptest
    rmr: DEPRECATED: Please use 'rm -r' instead.
    rmr: Permission denied: user=admin, access=ALL, inode="/user/hive_0.11/warehouse/ptest":root:hive:drwxr-xr-x
    Failed with exception Permission denied: user=admin, access=ALL, inode="/user/hive_0.11/warehouse/ptest":root:hive:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkSubAccess(FSPermissionChecker.java:174)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:144)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4684)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:2794)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:2757)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:2740)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:621)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:406)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44094)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)

    FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask

When I looked into the warehouse directory:

    hive> dfs -ls /user/hive_0.11/warehouse;
    Found 1 item
    drwxr-xr-x   - root hive   0 2013-08-05 17:04 /user/hive_0.11/warehouse/ptest

The ptest directory is owned by root (root:hive, drwxr-xr-x), even though the table was created from the Hive CLI running as admin, so admin has no write access to it. I'm unable to figure out why the owner of the table directory has been assigned as root. Could anyone please help me out?

Thank you,
Sachin
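P.S. In case it's relevant, this is the workaround I was planning to try: resetting ownership of the table directory to admin from the HDFS superuser. I'm assuming the superuser here is root, since the daemons were apparently started as root, and I'm not sure this is the right fix rather than just a band-aid:

    # run as the HDFS superuser (root in my setup, I believe)
    # hand the table directory back to the admin user, keeping the hive group
    hadoop fs -chown -R admin:hive /user/hive_0.11/warehouse/ptest
    # confirm the new ownership before retrying the LOAD
    hadoop fs -ls /user/hive_0.11/warehouse

Even if that works, I'd still like to understand why the directory gets created as root in the first place, so I don't have to do this for every new table.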