To: "Bajaj, Abhinav"
Subject: Re: Unable to load AWS credentials: Flink 1.2.1 + S3 + Kubernetes
Hi!
From Flink's view, this is pretty much all in Hadoop's magic once it has been delegated to s3a.
I seem to recall that there was an issue with older hadoop-aws versions or the AWS SDK version.
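Since s3a takes its credentials from the Hadoop configuration, one way to take the guesswork out is to pin the credential provider chain explicitly. A minimal core-site.xml sketch (property name from hadoop-aws; the provider class shown is the standard AWS SDK default chain, not something specific to this setup):

```xml
<!-- Hypothetical core-site.xml fragment: explicitly set the s3a
     credential provider instead of relying on implicit lookup. -->
<configuration>
  <property>
    <name>fs.s3a.aws.credentials.provider</name>
    <!-- Falls back through environment variables, Java system
         properties, profile files, and the EC2/container metadata
         service, in that order. -->
    <value>com.amazonaws.auth.DefaultAWSCredentialsProviderChain</value>
  </property>
</configuration>
```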
>
> Thanks for your response.
>
> ~ Abhinav
>
From: Stephan Ewen
Date: Thursday, March 29, 2018 at 2:30 AM
To: "dyana.rose"
Cc: user
Subject: Re: Unable to load AWS credentials: Flink 1.2.1 + S3 + Kubernetes
Using AWS credentials with Kubernetes is not trivial. Have you looked at the
AWS / Kubernetes docs and projects like https://github.com/jtblin/kube2iam,
which bridge between containers and AWS credentials?
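To sketch how the kube2iam approach works: it intercepts the pod's calls to the EC2 metadata endpoint and serves temporary credentials for an IAM role named in a pod annotation. A hypothetical TaskManager pod fragment (the role name and pod details are made up for illustration):

```yaml
# Hypothetical pod spec fragment for use with kube2iam.
# "flink-checkpoints-role" is an assumed IAM role name.
apiVersion: v1
kind: Pod
metadata:
  name: flink-taskmanager
  annotations:
    # kube2iam reads this annotation and answers the pod's
    # metadata-service credential requests with temporary
    # credentials for this role.
    iam.amazonaws.com/role: flink-checkpoints-role
spec:
  containers:
    - name: taskmanager
      image: flink:1.2.1
```

With this in place, the AWS SDK's default credential chain inside the container picks the credentials up from the (intercepted) metadata service without any keys baked into the image.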
Also, Flink 1.2.1 is quite old; you may want to try a newer version. 1.4.x
has a bit of an overhauled S3 filesystem setup.
Hiya,
This sounds like it may be similar to the issue I had when running on ECS. Take
a look at my ticket for how I got around this, and see if it's any help:
https://issues.apache.org/jira/browse/FLINK-8439
Dyana
On 2018/03/28 02:15:06, "Bajaj, Abhinav" wrote:
Hi,
I am trying to use Flink 1.2.1 with RocksDB as the state backend and S3 for
checkpoints.
I am using the Flink 1.2.1 Docker images and running them in a Kubernetes cluster.
I have followed the steps documented in the Flink documentation:
https://ci.apache.org/projects/flink/flink-docs-release-1.2/setup/
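For reference, the setup described above roughly corresponds to a flink-conf.yaml along these lines (key names as documented for Flink 1.2; the bucket name and Hadoop config path are assumptions for illustration):

```yaml
# Hypothetical flink-conf.yaml fragment: RocksDB state backend
# with checkpoints written to S3 via s3a.
state.backend: rocksdb
state.backend.fs.checkpointdir: s3a://my-bucket/flink/checkpoints

# s3a itself (including credentials) is configured through the
# Hadoop configuration directory this key points at
# (core-site.xml / hdfs-site.xml).
fs.hdfs.hadoopconf: /etc/hadoop/conf
```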