BTW, I checked the dependency tree, and the flink-iceberg demo only has one Hadoop
common dependency, so I'm not sure why Flink throws such an exception. Based on
the Flink docs, I suppose that the Flink binary doesn't include Hadoop dependencies,
right?
Based on the exception, looks like when FlinkCatalogFactor
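For reference, here is a minimal sketch (not taken from the demo) of the kind of code that exercises FlinkCatalogFactory. It assumes a Hadoop-type Iceberg catalog and a placeholder warehouse path. Creating the catalog builds an org.apache.hadoop.conf.Configuration internally, so hadoop-common has to come either from the job jar or from HADOOP_CLASSPATH, because the Flink binary itself doesn't ship Hadoop:

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class IcebergCatalogSketch {
  public static void main(String[] args) {
    // Streaming-mode table environment (Blink planner is the default in Flink 1.12).
    TableEnvironment tEnv = TableEnvironment.create(
        EnvironmentSettings.newInstance().inStreamingMode().build());

    // CREATE CATALOG goes through Iceberg's FlinkCatalogFactory, which needs
    // org.apache.hadoop.conf.Configuration; this is where the NoClassDefFoundError
    // shows up if hadoop-common is missing at runtime. The warehouse path is a placeholder.
    tEnv.executeSql(
        "CREATE CATALOG hadoop_catalog WITH ("
            + " 'type'='iceberg',"
            + " 'catalog-type'='hadoop',"
            + " 'warehouse'='hdfs://namenode:8020/warehouse/iceberg'"
            + ")");

    tEnv.executeSql("USE CATALOG hadoop_catalog");
  }
}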
Hi JING,
Thanks for the pointers.
1) I was able to debug why the variable `numOfElements` was getting reset to
0. The following method of AbstractMapBundleOperator.java was getting called,
which was resetting the variable to 0 before it could reach the max count:
@Override
public v
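To make the reset easier to see, here is a simplified, illustrative sketch of the map-bundle pattern (not the actual AbstractMapBundleOperator source): the per-record counter is cleared every time the bundle is flushed, and a flush also happens on watermarks and checkpoint barriers, so the counter can go back to 0 well before it reaches the configured max count.

// Simplified, illustrative sketch of the map-bundle pattern; not the actual
// Flink AbstractMapBundleOperator code.
public class BundleCounterSketch {
  private final long maxCount;
  private long numOfElements = 0;

  public BundleCounterSketch(long maxCount) {
    this.maxCount = maxCount;
  }

  // Called for every incoming record.
  public void processElement(Object record) {
    numOfElements++;
    // buffer the record in the current bundle ...
    if (numOfElements >= maxCount) {
      finishBundle(); // normal case: the bundle is full
    }
  }

  // The bundle is also flushed when a watermark or checkpoint barrier arrives,
  // which is why the counter can be reset before reaching maxCount.
  public void onWatermarkOrCheckpointBarrier() {
    finishBundle();
  }

  private void finishBundle() {
    // emit the buffered bundle downstream ...
    numOfElements = 0; // the reset observed in the debugger
  }
}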
Thanks for replying.
I'm using Flink 1.12.x, and I think Iceberg 0.12 actually uses Flink 1.12.
Once I upgraded Iceberg from 0.11.0 to 0.12.0 for the Java application, I
got a new exception as below:
java.lang.LinkageError: loader constraint violation: when resolving method
"org.apache.fli
Iceberg v0.11 and v0.12 are not compatible with Flink v1.13.x.
L. C. Hsieh wrote on Saturday, August 21, 2021 at 3:52 PM:
> Hi, I'm testing using Flink to write an Iceberg table. I run a Flink native K8s
> cluster locally and submit a simple Java program that writes out an Iceberg
> table (https://github.com/spancer/flink-iceberg-dem
Hi, I'm testing using Flink to write an Iceberg table. I run a Flink native K8s
cluster locally and submit a simple Java program that writes out an Iceberg table
(https://github.com/spancer/flink-iceberg-demo), but got an exception:
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration at
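The missing class here is org/apache/hadoop/conf/Configuration, so hadoop-common is simply not on the classpath that runs the job. As a hypothetical preflight check (not part of the flink-iceberg-demo), something like the following fails fast with a clearer message; the actual fix is either to bundle hadoop-common into the job jar or to make Hadoop available to the Flink cluster via HADOOP_CLASSPATH, since the Flink distribution does not ship Hadoop.

// Hypothetical preflight check, not part of the demo: fail fast with a clearer
// message if hadoop-common is missing from the classpath that runs the job.
public final class HadoopClasspathCheck {
  private HadoopClasspathCheck() {}

  public static void requireHadoopCommon() {
    try {
      Class.forName("org.apache.hadoop.conf.Configuration");
    } catch (ClassNotFoundException e) {
      throw new IllegalStateException(
          "hadoop-common is not on the classpath; bundle it in the job jar "
              + "or expose it to Flink via HADOOP_CLASSPATH", e);
    }
  }

  public static void main(String[] args) {
    requireHadoopCommon();
    System.out.println("org.apache.hadoop.conf.Configuration is available");
  }
}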