Hi all,

I am trying to use the S3 backend with a custom endpoint. However, this is not
supported in hadoop-aws@2.7.3; I need at least version 2.8.0. The
underlying reason is that requests are being sent as follows:

DEBUG [main] (AmazonHttpClient.java:337) - Sending Request: HEAD
http://mustafa.localhost:9000 / Headers:

Because "fs.s3a.path.style.access" is not recognized in old version.I want
the domain to remain same, the bucket name to be appended in the path (
http://localhost:9000/mustafa/...)
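
For reference, the configuration I am aiming for is just these two properties in
the core-site.xml that Flink picks up (property names as documented for
hadoop-aws 2.8.0, endpoint from my local setup):

<property>
  <name>fs.s3a.endpoint</name>
  <value>http://localhost:9000</value>
</property>
<property>
  <name>fs.s3a.path.style.access</name>
  <value>true</value>
</property>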

I cannot blindly bump the aws-java-sdk version to the latest release, because that causes:

Caused by: java.lang.NoClassDefFoundError: Could not initialize class
com.amazonaws.ClientConfiguration
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:182)
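
(In case it is relevant, I am checking which Hadoop and AWS SDK versions actually
end up on the classpath with:

mvn dependency:tree -Dincludes=org.apache.hadoop,com.amazonaws
)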

So, if I increase hadoop-aws to 2.8.0 together with the latest client, it causes
the following error:

Caused by: java.lang.IllegalAccessError: tried to access method
org.apache.hadoop.metrics2.lib.MutableCounterLong.<init>(Lorg/apache/hadoop/metrics2/MetricsInfo;J)V
from class org.apache.hadoop.fs.s3a.S3AInstrumentation
at org.apache.hadoop.fs.s3a.S3AInstrumentation.streamCounter(S3AInstrumentation.java:194)

According to
https://ci.apache.org/projects/flink/flink-docs-release-1.3/setup/aws.html#provide-s3-filesystem-dependency,
I need hadoop-aws@2.7.2.
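
For completeness, the 2.8.0 attempt above is just this change in my pom (plus the
newer aws-java-sdk already mentioned); the Flink dependencies themselves are
untouched:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-aws</artifactId>
  <version>2.8.0</version>
</dependency>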


Should I be excluding hadoop-common from Flink somehow? Building Flink from
source with mvn clean install -DskipTests -Dhadoop.version=2.8.0 works, but
I want to manage this via Maven as much as possible.
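
In other words, would something like the following in my pom be the way to go:
excluding hadoop-common from the Flink dependencies and adding it back at 2.8.0?
(Artifact names and versions here are just from my setup; I am not sure this is
even the right approach.)

<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-java_2.11</artifactId>
  <version>1.3.2</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.8.0</version>
</dependency>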
