Thanks, David. It worked after placing the jar inside its own sub-folder.
On Sat, Feb 1, 2020 at 2:37 AM David Magalhães wrote:
Did you put each one inside a separate folder named after the plugin? Like
/opt/flink/plugins/s3-fs-presto/flink-s3-fs-presto-1.9.1.jar ?
Check
https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/s3.html
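For reference, the layout described above can be sketched in a few shell commands (the scratch directory stands in for the real /opt/flink, and the jar is just touched here for illustration; in a real install you would copy the jar shipped with the distribution):

```shell
# One sub-folder per plugin under plugins/, named after the plugin;
# the jar goes inside that sub-folder, not directly under plugins/.
FLINK_HOME=$(mktemp -d)   # stands in for /opt/flink
mkdir -p "$FLINK_HOME/plugins/s3-fs-presto"
touch "$FLINK_HOME/plugins/s3-fs-presto/flink-s3-fs-presto-1.9.1.jar"
ls "$FLINK_HOME/plugins/s3-fs-presto"
```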
On Sat, Feb 1, 2020, 07:42 Navneeth Krishnan wrote:
Hi Arvid,
Thanks for the response.
I have both of the jars under /opt/flink/plugins but I'm still getting the
same error message. Also, can someone please provide some pointers on how
entropy injection works? How should I set up the directory structure?
In the link that you provided there is an aws-credent
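On the entropy question: in the Flink S3 filesystems, entropy injection is configured with two options in flink-conf.yaml, and the entropy key placed in the checkpoint path is replaced by a random string for checkpoint data files (and stripped for other files). A hedged sketch; the bucket name is a placeholder:

```yaml
# flink-conf.yaml -- entropy injection for S3 checkpoint paths
s3.entropy.key: _entropy_
s3.entropy.length: 4
state.checkpoints.dir: s3p://my-bucket/checkpoints/_entropy_/
```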
Hi Navneeth,
Did you follow the plugin folder structure? [1]
There is another plugin called flink-s3-fs-presto that you can use.
If you want to use both plugins, use s3a:// for s3-fs-hadoop (output) and
s3p:// for s3-fs-presto (checkpointing).
[1]
https://ci.apache.org/projects/flink/flink-docs-
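A minimal sketch of splitting the two schemes as suggested above (bucket names are placeholders): checkpoints go through the presto plugin via s3p://, while job output uses s3a:// and is served by the hadoop plugin:

```yaml
# flink-conf.yaml -- checkpointing through flink-s3-fs-presto
state.backend: filesystem
state.checkpoints.dir: s3p://my-bucket/flink-checkpoints
# Sink paths in the job code (e.g. a StreamingFileSink target) would
# then use s3a://my-bucket/output, handled by flink-s3-fs-hadoop.
```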
Hi All,
I'm trying to migrate from NFS to S3 for checkpointing and I'm facing a few
issues. I have Flink running in Docker with the flink-s3-fs-hadoop jar copied
to the plugins folder. Even with the jar in place I'm getting the following
error: Caused by:
org.apache.flink.core.fs.UnsupportedFileSystemSchemeE
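Since this setup runs in Docker, one hedged sketch of wiring the plugin into an image (the base tag matches the version in the thread, and the assumption that the distribution ships the optional jar under opt/ should be checked against your image):

```dockerfile
# Assumed: the official flink:1.9.1 image ships optional filesystem jars
# under /opt/flink/opt. The jar must land in its own plugins sub-folder,
# not directly in plugins/ or in lib/.
FROM flink:1.9.1
RUN mkdir -p /opt/flink/plugins/s3-fs-hadoop && \
    cp /opt/flink/opt/flink-s3-fs-hadoop-1.9.1.jar \
       /opt/flink/plugins/s3-fs-hadoop/
```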