Hi,

If the path already exists in HDFS, then you need to perform the following step:

$HADOOP_HOME/bin/hadoop fs -chmod g+w Exist_Path_hdfs

Example:
$HADOOP_HOME/bin/hadoop fs -chmod g+w /com/impetus/data
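To confirm that the group write permission took effect, you can list the parent directory afterwards (using the /com/impetus/data path from the example above):

$HADOOP_HOME/bin/hadoop fs -ls /com/impetus

If the directory is owned by a different user than the one running Hive, you may also need to change the owner with hadoop fs -chown, depending on your setup.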
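Also make sure hive-site.xml points at the same path. A minimal sketch of the entry, assuming the standard hive.metastore.warehouse.dir property and the /com/impetus/data path from the example:

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/com/impetus/data</value>
</property>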

On Wed, May 18, 2011 at 1:25 AM, Ankit Jain <ankitjainc...@gmail.com> wrote:

> Hi,
>
> At the time of installation, we perform the following steps:
>
> $HADOOP_HOME/bin/hadoop fs -mkdir /user/hive/warehouse
> $HADOOP_HOME/bin/hadoop fs -chmod g+w /user/hive/warehouse
>
> Replace /user/hive/warehouse with a new path.
>
> Example:-
> $HADOOP_HOME/bin/hadoop fs -mkdir /com/impetus/data
> $HADOOP_HOME/bin/hadoop fs -chmod g+w /com/impetus/data
>
> Then your data will be stored in the /com/impetus/data directory.
>
> Regards,
> Ankit Jain
>
>
> On Wed, May 18, 2011 at 12:53 AM, jinhang du <dujinh...@gmail.com> wrote:
>
>> Hi,
>> The default value is "/user/hive/warehouse" in hive-site.xml. After I
>> changed the directory to a path on HDFS, I got the following exception.
>>
>> FAILED: Error in metadata: MetaException(message:Got exception:
>> org.apache.hadoop.security.AccessControlException
>> org.apache.hadoop.security.AccessControlException: Permission denied:
>> user=root, access=WRITE, inode="output
>> FAILED: Execution Error, return code 1 from
>> org.apache.hadoop.hive.ql.exec.DDLTask
>>
>> Is this failure related to hadoop-site.xml or to something else?
>> Thanks for your help.
>>
>> --
>> dujinhang
>>
>
>
