I made the directory that 'hive.metastore.warehouse.dir' points to writable for
all users on HDFS, and the problem is solved: tables can now be created in my
specified directory. But this operation is not safe.
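
In case it helps to be concrete, what I did was roughly the following; the path
/user/mywarehouse is only a placeholder for my actual warehouse directory:

  # listing the parent shows the current owner and permissions of the directory
  hadoop fs -ls /user

  # make my warehouse directory writable for every user (recursively) --
  # this is the part that is not safe
  hadoop fs -chmod -R 777 /user/mywarehouse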

I don't understand how access control is supposed to work between Hive and HDFS.
Could you help me?
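
For reference, the cluster-related settings I changed in hive-site.xml (mentioned
in my earlier mail quoted below) look roughly like this; the host names, ports
and the warehouse path are placeholders for my real values:

  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>    <!-- placeholder NameNode address -->
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtracker-host:9001</value>         <!-- placeholder JobTracker address -->
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/mywarehouse</value>            <!-- placeholder warehouse path on HDFS -->
  </property>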

thx.

2011/5/18 jinhang du <dujinh...@gmail.com>

> Thanks for your answers.
>
> I run my Hive client console on a machine that does not belong to the
> Hadoop cluster.
> I just changed "fs.default.name" and "mapred.job.tracker" in hive-site.xml
> to connect to the Hadoop cluster.
> I can create tables, and the files are created on HDFS.
> When I changed hive.metastore.warehouse.dir, the exception appeared.
>
> The account I use to access the Hadoop cluster does not have permission to
> write to the path "/user/hive/warehouse".
> The owner of that directory is the superuser; it was created by someone
> else.
>
> Is there something wrong with my operations, or with my understanding of
> Hive?
> Can Hive act purely as a client, without any changes to the Hadoop
> configuration?
>
> Thanks.
>
> 2011/5/18 Ted Yu <yuzhih...@gmail.com>
>
>> Can you try as user hadoop?
>>
>> Cheers
>>
>>
>>
>> On May 17, 2011, at 9:53 PM, jinhang du <dujinh...@gmail.com> wrote:
>>
>> > Hi,
>> > The default value is "/user/hive/warehouse" in hive-site.xml. After I
>> > changed the directory to another path on HDFS, I got the exception below.
>> >
>> > FAILED: Error in metadata: MetaException(message:Got exception:
>> > org.apache.hadoop.security.AccessControlException
>> > org.apache.hadoop.security.AccessControlException: Permission denied:
>> > user=root, access=WRITE, inode="output
>> > FAILED: Execution Error, return code 1 from
>> > org.apache.hadoop.hive.ql.exec.DDLTask
>> >
>> > Is this failure related to hadoop-site.xml or something else?
>> > Thanks for your help.
>> >
>> > --
>> > dujinhang
>>
>
>
>
> --
> dujinhang
>



-- 
dujinhang
