Hi,
I hit a permission problem after creating a Hive table from a HiveContext
in the Spark shell, version 1.2.1. When I then tried to insert into this
table, I got an AccessControlException, because the table directory is
owned by the hive user, not my account.
From the HiveContext:
hiveContext.sql("create table orc_table(key INT, value STRING) stored as orc")
hiveContext.hql("INSERT INTO table orc_table select * from testtable")
--> Caused by:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
Permission denied: user=patcharee, access=WRITE,
inode="/apps/hive/warehouse/orc_table":hive:hdfs:drwxr-xr-x
From hadoop fs, the table directory is owned by hive:
[patcharee@machine-10-0 ~]$ hadoop fs -ls /apps/hive/warehouse/
drwxr-xr-x   - hive hdfs   0 2015-05-18 11:02 /apps/hive/warehouse/orc_table
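The denial follows directly from the permission bits in that listing: the directory is drwxr-xr-x with owner hive and group hdfs, so only the hive user has write access. A minimal sketch (a hypothetical helper, not actual Hadoop code) of the POSIX-style check HDFS is effectively making:

```python
def can_write(perm, owner, group, user, user_groups=()):
    """Return True if `user` may write, given an `ls`-style permission
    string like 'drwxr-xr-x' (file-type char + 9 rwx bits)."""
    bits = perm[1:]  # strip the leading file-type character
    owner_w = bits[1] == 'w'
    group_w = bits[4] == 'w'
    other_w = bits[7] == 'w'
    if user == owner:
        return owner_w
    if group in user_groups:
        return group_w
    return other_w

# The directory from the listing above: only the owner (hive) may write.
print(can_write('drwxr-xr-x', 'hive', 'hdfs', 'patcharee'))  # False
print(can_write('drwxr-xr-x', 'hive', 'hdfs', 'hive'))       # True
```

So any write by user patcharee into /apps/hive/warehouse/orc_table is rejected, matching the AccessControlException above.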
Any ideas?
BR,
Patcharee
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org