I thought it might be a CA certificates issue, but it looks like
openjdk:8-jre-alpine includes the proper certificates.
You could check this just to make sure: exec into the container and run
curl -v https://s3.amazonaws.com. You may have to run apk add --no-cache curl
first.
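For example, roughly (the pod name below is just a placeholder for one of your
Flink pods):

    # exec into the container (pod name is a placeholder)
    kubectl exec -it flink-taskmanager-0 -- sh
    # inside the container:
    apk add --no-cache curl
    curl -v https://s3.amazonaws.com

If the TLS handshake in the verbose output succeeds and S3 answers, the CA
bundle in the image should be fine.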
Apart from that, a se
On Wed, Oct 4, 2017 at 8:23 AM Hao Sun wrote:
I am running Flink 1.3.2 with docker on kubernetes. My docker is using
openjdk-8, I do not have hadoop, the version is 2.7, scala is 2.11. Thanks!

Here is what my docker file says:

FROM openjdk:8-jre-alpine
ENV FLINK_VERSION=1.3.2 \
    HADOOP_VERSION=27 \
    SCALA_VERSION=2.11 \
On Wed, Oct 4, 2017 at 8:11 AM Chesnay Schepler wrote:
I've found a few threads where an outdated jdk version on the
server/client may be the cause.
Which Flink binary (specifically, for which hadoop version) are you using?
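If an outdated JDK turns out to be the suspect, a quick way to see what the
image actually ships (just a sketch; the pod name is a placeholder) would be:

    # JRE version baked into the base image
    docker run --rm openjdk:8-jre-alpine java -version
    # or in an already-running pod:
    kubectl exec flink-taskmanager-0 -- java -version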
On 03.10.2017 20:48, Hao Sun wrote:
I am using S3 for checkpointing and externalized checkpoints as well.
s3a://bucket/checkpoints/e58d369f5a181842768610b5ab6a500b
I have this exception, and I am not sure what I can do with it:
com.amazonaws.http.AmazonHttpClient - Unable to execute HTTP request
I guess I have to configure hadoop to use some SSLFactory?
I am not using hadoop; I am on kubernetes (in AWS) with S3.
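For reference, a minimal sketch of what I mean by checkpointing to S3 (the
/opt/flink path and the bucket/path are placeholders, not my real values):

    # checkpoint-related lines appended to the container's flink-conf.yaml
    FLINK_CONF=/opt/flink/conf/flink-conf.yaml
    echo "state.backend: filesystem"                                >> "$FLINK_CONF"
    echo "state.backend.fs.checkpointdir: s3a://bucket/checkpoints" >> "$FLINK_CONF"
    echo "state.checkpoints.dir: s3a://bucket/checkpoints"          >> "$FLINK_CONF"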
Thanks!