This is a limitation of the presto version; use
flink-s3-fs-hadoop-1.11.3.jar instead.
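A minimal Dockerfile sketch for that swap, assuming the hadoop jar ships in the image's opt/ directory alongside the presto one:

FROM flink:1.11.3-scala_2.12-java11
# put the hadoop S3 filesystem jar into its own subdirectory under plugins/
RUN mkdir -p ./plugins/flink-s3-fs-hadoop && \
    cp ./opt/flink-s3-fs-hadoop-1.11.3.jar ./plugins/flink-s3-fs-hadoop/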
On 08/09/2021 20:39, Dhiru wrote:
I copied:

FROM flink:1.11.3-scala_2.12-java11
RUN mkdir ./plugins/flink-s3-fs-presto
RUN cp ./opt/flink-s3-fs-presto-1.11.3.jar ./plugins/flink-s3-fs-presto/

then started getting this error, trying to run on AWS EKS and trying to access
an S3 bucket:

2021-09-08 14:38:10 java.lang.UnsupportedOperatio
You need to put the flink-s3-fs-hadoop/presto jar into a directory
within the plugins directory; for example, the final path should look
like this:
/opt/flink/plugins/flink-s3-fs-hadoop/flink-s3-fs-hadoop-1.13.1.jar
Furthermore, you only need either the hadoop or the presto jar, _not_ both
of them.
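For example, a sketch of the steps (assuming the 1.13.1 jar ships under
/opt/flink/opt/ as in the official image):

mkdir -p /opt/flink/plugins/flink-s3-fs-hadoop
cp /opt/flink/opt/flink-s3-fs-hadoop-1.13.1.jar /opt/flink/plugins/flink-s3-fs-hadoop/
# expected result:
# /opt/flink/plugins/flink-s3-fs-hadoop/flink-s3-fs-hadoop-1.13.1.jar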
Yes, I copied it to the plugins folder, but I am not sure; I see the same jar
in /opt as well by default:

root@d852f125da1f:/opt/flink/plugins# ls
README.txt  flink-s3-fs-hadoop-1.13.1.jar  metrics-datadog  metrics-influx
metrics-prometheus  metrics-statsd  external-resource-gpu
flink-s3-fs-presto-1.1
I need to configure AWS S3 and am getting this error:
org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find
a file system implementation for scheme 's3'. The scheme is directly supported
by Flink through the following plugins: flink-s3-fs-hadoop, flink-s3-fs-presto.
Please ensur
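For completeness: once one of those plugin jars sits in its own folder under
/opt/flink/plugins (as described in the replies above), static S3 credentials
can be supplied via flink-conf.yaml. A minimal sketch with placeholder values
(on EKS an IAM role usually makes these keys unnecessary):

# append placeholder credentials to the Flink configuration
echo "s3.access-key: YOUR_ACCESS_KEY" >> /opt/flink/conf/flink-conf.yaml
echo "s3.secret-key: YOUR_SECRET_KEY" >> /opt/flink/conf/flink-conf.yaml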