[ https://issues.apache.org/jira/browse/FLINK-7905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16218479#comment-16218479 ]
Chesnay Schepler commented on FLINK-7905:
-----------------------------------------

This just occurred again: https://travis-ci.org/zentol/flink/builds/292549997

{code}
Running org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase
Tests run: 3, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 3.1 sec <<< FAILURE! - in org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase
testDirectoryListing(org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase) Time elapsed: 0.257 sec <<< ERROR!
java.nio.file.AccessDeniedException: s3://[secure]/tests-3d000b6f-1118-4f61-91a3-57f893fdd654/testdir: getFileStatus on s3://[secure]/tests-3d000b6f-1118-4f61-91a3-57f893fdd654/testdir: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 143C577462F68C24), S3 Extended Request ID: u96bfOdpa2cjC7DcrVLiS6Qol/5Y3gz1RlbBwUp85eG51PwtwWqEJy/r8Cy5iLF3u2hyMfJLx44=
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1579)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1249)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1030)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:742)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:716)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4194)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4141)
    at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1256)
    at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1232)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:904)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:1553)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:117)
    at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.getFileStatus(HadoopFileSystem.java:77)
    at org.apache.flink.core.fs.FileSystem.exists(FileSystem.java:509)
    at org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase.testDirectoryListing(HadoopS3FileSystemITCase.java:163)

testSimpleFileWriteAndRead(org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase) Time elapsed: 0.206 sec <<< ERROR!
java.nio.file.AccessDeniedException: s3://[secure]/tests-3d000b6f-1118-4f61-91a3-57f893fdd654/test.txt: getFileStatus on s3://[secure]/tests-3d000b6f-1118-4f61-91a3-57f893fdd654/test.txt: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 24F178CAC9F6A2C4), S3 Extended Request ID: W8MWwBlF3UkvXw7VhuXGSiruZe+U/Gf/BpHPEMPr6/2ZLvKWfk/AXWthpkECcj35DYUAuV2l/HQ=
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1579)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1249)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1030)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:742)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:716)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
    at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
    at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4194)
    at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4141)
    at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1256)
    at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1232)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:904)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:1553)
    at org.apache.hadoop.fs.s3a.S3AFileSystem.delete(S3AFileSystem.java:1234)
    at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.delete(HadoopFileSystem.java:134)
    at org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase.testSimpleFileWriteAndRead(HadoopS3FileSystemITCase.java:147)
{code}

> HadoopS3FileSystemITCase failed on travis
> -----------------------------------------
>
> Key: FLINK-7905
> URL: https://issues.apache.org/jira/browse/FLINK-7905
> Project: Flink
> Issue Type: Bug
> Components: FileSystem, Tests
> Affects Versions: 1.4.0
> Environment: https://travis-ci.org/zentol/flink/jobs/291550295
> https://travis-ci.org/tillrohrmann/flink/jobs/291491026
> Reporter: Chesnay Schepler
> Assignee: Stephan Ewen
> Labels: test-stability
> Fix For: 1.4.0
>
>
> The {{HadoopS3FileSystemITCase}} is flaky on Travis because its access got denied by S3.
> {code}
> -------------------------------------------------------
>  T E S T S
> -------------------------------------------------------
> Running org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase
> Tests run: 3, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 3.354 sec <<< FAILURE! - in org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase
> testDirectoryListing(org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase) Time elapsed: 0.208 sec <<< ERROR!
> java.nio.file.AccessDeniedException: s3://[secure]/tests-9273972a-70c2-4f06-862e-d02936313fea/testdir: getFileStatus on s3://[secure]/tests-9273972a-70c2-4f06-862e-d02936313fea/testdir: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 9094999D7456C589), S3 Extended Request ID: fVIcROQh4E1/GjWYYV6dFp851rjiKtFgNSCO8KkoTmxWbuxz67aDGqRiA/a09q7KS6Mz1Tnyab4=
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1579)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1249)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1030)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:742)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:716)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
>     at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
>     at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4194)
>     at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4141)
>     at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1256)
>     at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1232)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:904)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:1553)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:117)
>     at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.getFileStatus(HadoopFileSystem.java:77)
>     at org.apache.flink.core.fs.FileSystem.exists(FileSystem.java:509)
>     at org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase.testDirectoryListing(HadoopS3FileSystemITCase.java:163)
>
> testSimpleFileWriteAndRead(org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase) Time elapsed: 0.275 sec <<< ERROR!
> java.nio.file.AccessDeniedException: s3://[secure]/tests-9273972a-70c2-4f06-862e-d02936313fea/test.txt: getFileStatus on s3://[secure]/tests-9273972a-70c2-4f06-862e-d02936313fea/test.txt: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: B3D8126BE6CF169F), S3 Extended Request ID: T34sn+a/CcCFv+kFR/UbfozAkXXtiLDu2N31Ok5EydgKeJF5I2qXRCC/MkxSi4ymiiVWeSyb8FY=
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1579)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1249)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1030)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:742)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:716)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
>     at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
>     at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
>     at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4194)
>     at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4141)
>     at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1256)
>     at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1232)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:904)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:1553)
>     at org.apache.hadoop.fs.s3a.S3AFileSystem.delete(S3AFileSystem.java:1234)
>     at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.delete(HadoopFileSystem.java:134)
>     at org.apache.flink.fs.s3hadoop.HadoopS3FileSystemITCase.testSimpleFileWriteAndRead(HadoopS3FileSystemITCase.java:147)
> {code}
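
Both failures surface through the same call path: the test goes through Flink's {{HadoopFileSystem}} wrapper into Hadoop's {{S3AFileSystem}}, whose {{getFileStatus}} issues the {{getObjectMetadata}} request that S3 intermittently rejects with 403 Forbidden. The following is only a minimal sketch of that path, not the actual test code; the bucket name and key are placeholders (the ITCase uses a random {{tests-<uuid>}} prefix in a bucket supplied via secured CI credentials), and it assumes the flink-s3-fs-hadoop factory and valid S3 credentials are available on the classpath and in the Flink configuration.

{code}
import org.apache.flink.core.fs.FileSystem;
import org.apache.flink.core.fs.Path;

public class S3AccessCheck {

    public static void main(String[] args) throws Exception {
        // Placeholder bucket/key, NOT the values used by the ITCase.
        Path path = new Path("s3://my-test-bucket/tests-example/test.txt");

        // Path#getFileSystem() resolves the s3:// scheme to the
        // Hadoop/S3A-backed file system registered by flink-s3-fs-hadoop.
        FileSystem fs = path.getFileSystem();

        // FileSystem#exists() delegates to HadoopFileSystem#getFileStatus(),
        // which calls S3AFileSystem#getFileStatus() and ultimately
        // AmazonS3Client#getObjectMetadata() - the request that comes back
        // as "403 Forbidden" in the stack traces above.
        System.out.println("exists: " + fs.exists(path));
    }
}
{code}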