[ https://issues.apache.org/jira/browse/HIVE-12746?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15071382#comment-15071382 ]

wangfeng commented on HIVE-12746:
---------------------------------

I looked at the HiveMetaStore.java code. The method is

private boolean drop_table_core(final RawStore ms, final String dbname, final String name,
    final boolean deleteData, final EnvironmentContext envContext,
    final String indexName) throws NoSuchObjectException, MetaException,
    IOException, InvalidObjectException, InvalidInputException

The related code is:

        isExternal = isExternal(tbl);
        if (tbl.getSd().getLocation() != null) {
          tblPath = new Path(tbl.getSd().getLocation());
          if (!wh.isWritable(tblPath.getParent())) {
            String target = indexName == null ? "Table" : "Index table";
            throw new MetaException(target + " metadata not deleted since " +
                tblPath.getParent() + " is not writable by " +
                hiveConf.getUser());
          }
        }

Here I suggest that before the code checks whether the HDFS path is writable, it should
first check whether the table is external. Only if isExternal is false should the HDFS
write permission be checked; see the sketch below.
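A minimal sketch of the suggested guard, reusing the locals from the snippet above
(isExternal, tbl, tblPath, wh, hiveConf, indexName). This is only an illustration of
the idea, not the actual patch:

        isExternal = isExternal(tbl);
        // Only verify HDFS write permission for managed tables: dropping an
        // external table removes metadata only and leaves the data in place.
        if (!isExternal && tbl.getSd().getLocation() != null) {
          tblPath = new Path(tbl.getSd().getLocation());
          if (!wh.isWritable(tblPath.getParent())) {
            String target = indexName == null ? "Table" : "Index table";
            throw new MetaException(target + " metadata not deleted since " +
                tblPath.getParent() + " is not writable by " +
                hiveConf.getUser());
          }
        }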

> when dropping external hive tables, the hive metastore should not check the hdfs 
> path write permission
> -------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-12746
>                 URL: https://issues.apache.org/jira/browse/HIVE-12746
>             Project: Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 1.2.1
>         Environment: hive1.2.1 hadoop2.6
>            Reporter: wangfeng
>            Priority: Critical
>              Labels: hdfspermission, metastore
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> 1. user1 has read permission on the HDFS path '/user/www/seller_shop_info';
> 2. user1 creates an external table seller_shop_info on that HDFS path;
> 3. user1 drops the external table seller_shop_info;
> then the problem occurs:
> hive> drop table seller_shop_info;
> FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Table metadata 
> not deleted since hdfs://argo/user/www/seller_shop_info is not writable by 
> user1)
> Because Hive does not delete the HDFS path when dropping an external table, the 
> hive metastore should not check the HDFS write permission.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
