t oo created HADOOP-15542:
-----------------------------

             Summary: S3AFileSystem - FileAlreadyExistsException when prefix is 
a file and part of a directory tree
                 Key: HADOOP-15542
                 URL: https://issues.apache.org/jira/browse/HADOOP-15542
             Project: Hadoop Common
          Issue Type: Bug
          Components: tools
    Affects Versions: 2.7.5, 3.1.0
            Reporter: t oo


We are running Apache Spark jobs with aws-java-sdk-1.7.4.jar and
hadoop-aws-2.7.5.jar to write Parquet files to an S3 bucket. The bucket
contains the key 's3://mybucket/d1/d2/d3/d4/d5/d6/d7', where d7 is a text
file. It also contains the key
's3://mybucket/d1/d2/d3/d4/d5/d6/d7/d8/d9/part_dt=20180615/a.parquet',
where a.parquet is a file.

When we run a Spark job to write a b.parquet file under
's3://mybucket/d1/d2/d3/d4/d5/d6/d7/d8/d9/part_dt=20180616/' (i.e. we expect
's3://mybucket/d1/d2/d3/d4/d5/d6/d7/d8/d9/part_dt=20180616/b.parquet' to be
created in S3), we get the error below:
org.apache.hadoop.fs.FileAlreadyExistsException: Can't make directory for path
's3a://mybucket/d1/d2/d3/d4/d5/d6/d7' since it is a file.
        at org.apache.hadoop.fs.s3a.S3AFileSystem.mkdirs(S3AFileSystem.java:861)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1881)
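For context, S3AFileSystem.mkdirs walks up the ancestors of the requested path
and refuses to create the directory if any existing ancestor resolves to a file
rather than a directory, which is exactly what happens here since the d7 key is
a file. A minimal Python sketch of that ancestor check (the dict-based store
and helper names are illustrative, not Hadoop's actual API):

```python
# Sketch only (not Hadoop code): why mkdirs rejects a path whose
# ancestor "directory" is actually an existing file object.

class FileAlreadyExistsException(Exception):
    pass

def mkdirs(store, path):
    """Walk every ancestor prefix of `path`; fail if one is a file.

    `store` maps object keys to "file" or "dir", standing in for
    what S3A learns from HEAD/LIST calls against the bucket.
    """
    parts = path.strip("/").split("/")
    for i in range(1, len(parts) + 1):
        prefix = "/".join(parts[:i])
        if store.get(prefix) == "file":
            raise FileAlreadyExistsException(
                f"Can't make directory for path '{prefix}' since it is a file."
            )
    store[path.strip("/")] = "dir"

# The layout from this report: d7 exists as a file, yet deeper keys
# under d7/ also exist, because S3 itself has no directory tree.
store = {
    "d1/d2/d3/d4/d5/d6/d7": "file",
    "d1/d2/d3/d4/d5/d6/d7/d8/d9/part_dt=20180615/a.parquet": "file",
}
try:
    mkdirs(store, "d1/d2/d3/d4/d5/d6/d7/d8/d9/part_dt=20180616")
except FileAlreadyExistsException as e:
    print(e)
```

Note that S3 happily stores both the d7 object and keys "under" d7/, since keys
are flat; it is only the Hadoop filesystem layer that enforces the file/directory
distinction and raises here.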

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
