We had a similar thread on this ML where a user was running a job through
an IDE. It seems that FileSystems are not automatically initialized by the
LocalExecutor, so as a workaround [1] you should initialize them manually
in your main() before accessing the FileSystems.

[1]
https://github.com/apache/flink/blob/master/flink-core/src/main/java/org/apache/flink/core/fs/FileSystem.java#L319-L319
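
For reference, a minimal sketch of that workaround (assuming the Scala
setup from the original post; FileSystem.initialize accepts a null
PluginManager, which skips plugin discovery and should be fine for a
purely local run):

  import org.apache.flink.configuration.Configuration
  import org.apache.flink.core.fs.FileSystem
  import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

  val flinkConf = new Configuration()
  flinkConf.setString("s3.endpoint", "http://127.0.0.1:9000")
  flinkConf.setString("s3.aws.credentials.provider",
    "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider")

  // Initialize the FileSystems with the same configuration *before* the
  // environment or any sink touches them, so the s3.* keys actually reach
  // the Hadoop S3A filesystem.
  FileSystem.initialize(flinkConf, null)

  val env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(flinkConf)

Note this assumes flink-s3-fs-hadoop is on the classpath rather than
loaded as a plugin, since passing null disables plugin-based filesystem
discovery.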

On Tue, Oct 19, 2021 at 6:36 PM Pavel Penkov <ebonfortr...@gmail.com> wrote:

> I've placed a flink-conf.yaml file in the conf dir but
> StreamExecutionEnvironment.getExecutionEnvironment doesn't pick it up. If
> set programmatically, the keys are visible in the Flink Web UI; they are
> just not passed to the Hadoop FS.
>
> On 2021/10/18 03:04:04, Yangze Guo <k...@gmail.com> wrote:
> > Hi, Pavel.
> >
> > From my understanding of the doc[1], you need to set it in
> > flink-conf.yaml instead of in your job.
> >
> > [1]
> > https://ci.apache.org/projects/flink/flink-docs-master/docs/deployment/filesystems/s3/#hadooppresto-s3-file-systems-plugins
> >
> > Best,
> > Yangze Guo
> >
> > On Sat, Oct 16, 2021 at 5:46 AM Pavel Penkov <eb...@gmail.com> wrote:
> > >
> > > Apparently Flink 1.14.0 doesn't correctly translate S3 options when
> > > they are set programmatically. I'm creating a local environment like
> > > this to connect to a local MinIO instance:
> > >
> > >   val flinkConf = new Configuration()
> > >   flinkConf.setString("s3.endpoint", "http://127.0.0.1:9000")
> > >   flinkConf.setString("s3.aws.credentials.provider",
> > >     "org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider")
> > >
> > >   val env =
> > >     StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(flinkConf)
> > >
> > > Then StreamingFileSink fails with a huge stack trace, the most
> > > relevant messages being:
> > >
> > >   Caused by: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException:
> > >   No AWS Credentials provided by SimpleAWSCredentialsProvider
> > >   EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider :
> > >   com.amazonaws.SdkClientException: Failed to connect to service endpoint:
> > >
> > > which means that Hadoop tried to enumerate all of the credential
> > > providers instead of using the one set in the configuration. What am I
> > > doing wrong?
> >
>
