[ 
https://issues.apache.org/jira/browse/SQOOP-3043?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15635821#comment-15635821
 ] 

Ramesh B commented on SQOOP-3043:
---------------------------------

This is due to the following code:
{code}
private void removeTempLogs(Path tablePath) throws IOException {
    FileSystem fs = FileSystem.get(configuration);
    Path logsPath = new Path(tablePath, "_logs");
    if (fs.exists(logsPath)) {...
{code}

I have provided a pull request here:
https://github.com/apache/sqoop/pull/29/files
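
A minimal sketch of the kind of fix involved (not necessarily identical to the linked pull request): instead of {{FileSystem.get(configuration)}}, which always resolves to the default filesystem (typically HDFS), obtain the filesystem from the path itself via {{Path.getFileSystem(Configuration)}}, so an {{s3a://}} target directory resolves against the S3A filesystem.
{code}
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

private void removeTempLogs(Path tablePath) throws IOException {
    // Resolve the FileSystem from the table path's own scheme
    // (hdfs://, s3a://, ...) rather than from the default FS.
    FileSystem fs = tablePath.getFileSystem(configuration);
    Path logsPath = new Path(tablePath, "_logs");
    if (fs.exists(logsPath)) {
        // Recursively delete the temporary _logs directory.
        fs.delete(logsPath, true);
    }
}
{code}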

> Sqoop HiveImport fails with Wrong FS while removing the _logs 
> --------------------------------------------------------------
>
>                 Key: SQOOP-3043
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3043
>             Project: Sqoop
>          Issue Type: Bug
>          Components: hive-integration
>            Reporter: Ramesh B
>
> When an s3:// path is given as --target-dir together with --hive-import, Sqoop fails with 
> {code}ERROR tool.ImportTool: Imported Failed: Wrong FS: 
> s3a://dataplatform/sqoop/target/user/_logs, expected: hdfs://nn1
> {code}
> This is due to the removeTempLogs method in HiveImport.java, which expects 
> the path to be on HDFS.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)