Hi Dan,

somehow enabling debug statements did not work. However, the logs help to narrow down the issue: the exception occurs neither on the jobmanager nor on the taskmanager, but wherever you execute the command line interface.

How do you execute the job? Do you start it from your machine? Can you also try adding the respective s3 plugin there?
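For example, if the job is submitted with bin/flink run from a plain Flink 1.11.1 distribution on your machine, something along these lines should make the filesystem available to the client as well (just a sketch; adjust the path and version to your setup, and use s3-fs-hadoop/s3a analogously):

    # copy the bundled plugin jar into its own subfolder under plugins/
    cd /path/to/flink-1.11.1
    mkdir -p plugins/s3-fs-presto
    cp opt/flink-s3-fs-presto-1.11.1.jar plugins/s3-fs-presto/
    ./bin/flink run ...

If you submit from another container or pod instead, the same plugins/ layout has to exist in that image.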
Best,
Arvid

On Thu, Sep 10, 2020 at 7:50 PM Dan Hill <quietgol...@gmail.com> wrote:
> I changed the levels to DEBUG. I don't see useful data in the logs.
>
> https://drive.google.com/file/d/1ua1zsr3BInY_8xdsWwA__F0uloAqy-vG/view?usp=sharing
>
> On Thu, Sep 10, 2020 at 8:45 AM Arvid Heise <ar...@ververica.com> wrote:
>> Could you try 1) or 2), enable debug logging*, and share the log with us?
>>
>> *Usually by adjusting FLINK_HOME/conf/log4j.properties.
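>>
>> (With the stock Flink 1.11 distribution that file uses the Log4j 2 properties format, so raising the verbosity is typically a one-line change, assuming the default file:
>>
>>     rootLogger.level = DEBUG
>>
>> The command line client usually picks up conf/log4j-cli.properties instead, which has the same key.)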
>>
>> On Thu, Sep 10, 2020 at 5:38 PM Dan Hill <quietgol...@gmail.com> wrote:
>>> Ah, sorry, it's a copy/paste issue with this email. I've tried the following:
>>> 1) using an s3a uri with the flink-s3-fs-hadoop jar in /opt/flink/plugins/s3-fs-hadoop.
>>> 2) using an s3p uri with the flink-s3-fs-presto jar in /opt/flink/plugins/s3-fs-presto.
>>> 3) loading both 1 and 2.
>>> 4) trying an s3 uri.
>>>
>>> When doing 1):
>>>
>>> Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3a'. The scheme is directly supported by Flink through the following plugin: flink-s3-fs-hadoop. Please ensure that each plugin resides within its own subfolder within the plugins directory. See https://ci.apache.org/projects/flink/flink-docs-stable/ops/plugins.html for more information. If you want to use a Hadoop file system for that scheme, please add the scheme to the configuration fs.allowed-fallback-filesystems. For a full list of supported file systems, please see https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/.
>>>
>>> When doing 2):
>>>
>>> Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3p'. The scheme is directly supported by Flink through the following plugin: flink-s3-fs-presto. Please ensure that each plugin resides within its own subfolder within the plugins directory. See https://ci.apache.org/projects/flink/flink-docs-stable/ops/plugins.html for more information. If you want to use a Hadoop file system for that scheme, please add the scheme to the configuration fs.allowed-fallback-filesystems. For a full list of supported file systems, please see https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/.
>>>
>>> etc.
>>>
>>> On Thu, Sep 10, 2020 at 8:15 AM Arvid Heise <ar...@ververica.com> wrote:
>>>> Hi Dan,
>>>>
>>>> s3p is only provided by the flink-s3-fs-presto plugin; the plugin you used provides s3a (both also provide s3, but it's good to use the more specific prefix).
>>>>
>>>> Best,
>>>> Arvid
>>>>
>>>> On Thu, Sep 10, 2020 at 9:24 AM Dan Hill <quietgol...@gmail.com> wrote:
>>>>> *Background*
>>>>> I'm converting some prototype Flink v1.11.1 code that uses the DataSet/DataTable APIs to use the Table API.
>>>>>
>>>>> *Problem*
>>>>> When switching to the Table API, my s3 plugins stopped working. I don't know why. I've added the required maven table dependencies to the job.
>>>>>
>>>>> I've tried moving both the presto and the hadoop s3 jars to plugin subfolders. No luck.
>>>>>
>>>>> Any ideas what is wrong? I'm guessing I'm missing something simple.
>>>>>
>>>>> *Error*
>>>>>
>>>>> Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3p'. The scheme is directly supported by Flink through the following plugin: flink-s3-fs-presto. Please ensure that each plugin resides within its own subfolder within the plugins directory. See https://ci.apache.org/projects/flink/flink-docs-stable/ops/plugins.html for more information. If you want to use a Hadoop file system for that scheme, please add the scheme to the configuration fs.allowed-fallback-filesystems. For a full list of supported file systems, please see https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/.
>>>>>
>>>>> at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:473)
>>>>> at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:389)
>>>>> at org.apache.flink.core.fs.Path.getFileSystem(Path.java:292)
>>>>> at org.apache.flink.table.filesystem.FileSystemTableSink.toStagingPath(FileSystemTableSink.java:232)
>>>>> ... 35 more
>>>>>
>>>>> *ls of plugins directory (same for taskmanager)*
>>>>>
>>>>> kubectl exec pod/flink-jobmanager-0 -- ls -l /opt/flink/plugins/s3-fs-hadoop
>>>>> total 19520
>>>>> -rw-r--r-- 1 root root 19985452 Sep 10 06:27 flink-s3-fs-hadoop-1.11.1.jar

--

Arvid Heise | Senior Java Developer

<https://www.ververica.com/>

Follow us @VervericaData

--

Join Flink Forward <https://flink-forward.org/> - The Apache Flink Conference

Stream Processing | Event Driven | Real Time

--

Ververica GmbH | Invalidenstrasse 115, 10115 Berlin, Germany

--
Ververica GmbH
Registered at Amtsgericht Charlottenburg: HRB 158244 B
Managing Directors: Timothy Alexander Steinert, Yip Park Tung Jason, Ji (Toni) Cheng