Thanks for the help, guys. I can work with that.
Maybe it makes sense to add something like that to the Parquet format documentation page:
https://nightlies.apache.org/flink/flink-docs-release-1.16/docs/connectors/table/formats/parquet/
This documentation does not mention Hadoop at all, and it seemed just as
strai...
Hi Frank,
Parquet always requires Hadoop. There is a Parquet ticket to make it
possible to read/write Parquet without depending on Hadoop, but that's
still open. So in order for Flink to be able to work with Parquet, it
requires the necessary Hadoop dependencies, as outlined in
https://nightlies.apa...
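For what it's worth, one quick way to double-check whether the Hadoop classes
are actually inside the custom image is a tiny probe along these lines (a
minimal sketch; org.apache.hadoop.conf.Configuration is just a representative
Hadoop class, not necessarily the exact one your job reports missing):

public class HadoopClasspathCheck {
    public static void main(String[] args) {
        // Representative Hadoop class; pass the class your job actually
        // reports missing as the first argument to probe for that one instead.
        String probe = args.length > 0 ? args[0] : "org.apache.hadoop.conf.Configuration";
        try {
            Class.forName(probe);
            System.out.println("Found " + probe + " on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("Missing " + probe + " -- the Hadoop dependencies are not on the classpath");
        }
    }
}

Run it with the same classpath the job uses, e.g. java -cp "/opt/flink/lib/*:."
HadoopClasspathCheck (assuming the default /opt/flink/lib layout of the upstream
Flink image; a customized image may place jars elsewhere).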
Hi all, I’m using the Flink k8s operator to run a SQL streaming job to/from
various connectors, and I just added the Parquet format. I customized the image
a bit per the example (mostly by adding Maven downloads of flink-connector*
jars). If I do that for flink-parquet-1.16.1, it fails on a missing
org/apache/ha...
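(For illustration, a minimal sketch of the kind of job described above; the
table names, the datagen stand-in source, and the path are made up, but a
filesystem table declared with 'format' = 'parquet' relies on flink-parquet,
which in turn needs Hadoop classes at runtime:)

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ParquetSinkSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // Stand-in source; in the real job this would be one of the
        // "various connectors" mentioned above.
        tEnv.executeSql(
                "CREATE TABLE src (id BIGINT, payload STRING) "
                        + "WITH ('connector' = 'datagen')");

        // Filesystem sink using the Parquet format provided by flink-parquet.
        tEnv.executeSql(
                "CREATE TABLE parquet_sink (id BIGINT, payload STRING) WITH ("
                        + " 'connector' = 'filesystem',"
                        + " 'path' = 'file:///tmp/parquet-out',"
                        + " 'format' = 'parquet')");

        // Writing Parquet is where a missing org/apache/hadoop/... class
        // surfaces if the Hadoop dependencies are not on the classpath.
        tEnv.executeSql("INSERT INTO parquet_sink SELECT id, payload FROM src");
    }
}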