Hi,

In my application, I am reading Avro data from Kafka as GenericRecord and writing it to Iceberg tables. While reading the data from Kafka, I see the following stack trace:
Caused by: java.lang.ClassCastException: class java.lang.Integer cannot be cast to class java.lang.Long (java.lang.Integer and java.lang.Long are in module java.base of loader 'bootstrap')
    at org.apache.flink.table.data.GenericRowData.getLong(GenericRowData.java:154)
    at org.apache.flink.table.data.RowData.lambda$createFieldGetter$245ca7d1$7(RowData.java:249)
    at org.apache.flink.table.runtime.typeutils.RowDataSerializer.copyRowData(RowDataSerializer.java:170)
    at org.apache.flink.table.runtime.typeutils.RowDataSerializer.copy(RowDataSerializer.java:131)
    at org.apache.flink.table.runtime.typeutils.RowDataSerializer.copy(RowDataSerializer.java:48)
    at org.apache.flink.streaming.runtime.tasks.CopyingChainingOutput.pushToOperator(CopyingChainingOutput.java:74)

Looking through the code, this seems like a bug in Flink: the method does `return (long) this.fields[i]`, where `fields` is an `Object[]`. The values are therefore stored as boxed `Integer`, not `int`, and would need to be converted to `Long` (or widened via `Number`) before the cast to `long` can succeed.

Is there a way to work around this, for example using my own classes, or should I file a bug report and wait for a fix?

Thanks,
Munir
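For anyone who wants to reproduce the cast behavior in isolation: the sketch below mimics the relevant part of `GenericRowData` (an `Object[]` of boxed field values) without depending on Flink. It shows that `(long) obj` on an `Object` holding an `Integer` compiles to a `(Long)` cast plus unboxing and therefore throws `ClassCastException`, and that widening via `((Number) obj).longValue()` works. As a possible workaround (an assumption on my side, not a confirmed fix), you could apply this widening in your own map function that converts the Avro `GenericRecord` to `RowData`, so that `BIGINT` columns are always populated with `Long` before Flink reads them back.

```java
public class CastDemo {
    public static void main(String[] args) {
        // Mimics GenericRowData's internal storage: fields are boxed Objects.
        Object[] fields = new Object[1];
        fields[0] = 42; // an Avro "int" arrives boxed as java.lang.Integer

        try {
            // This is what the failing code does: cast Object to long.
            // It compiles to (Long) fields[0] followed by unboxing,
            // so an Integer value triggers ClassCastException.
            long v = (long) fields[0];
            System.out.println(v);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the stack trace");
        }

        // Widening through Number avoids the exact-type cast entirely.
        long ok = ((Number) fields[0]).longValue();
        System.out.println(ok); // 42
    }
}
```

Normalizing the boxed types on the producing side (your record-to-row conversion) sidesteps the problem without patching Flink, since `getLong` then only ever sees `Long` values.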