I might be misunderstanding Flink's Avro support. I assumed that not declaring a field in the `CREATE TABLE` statement would work fine, but if I leave out any field that comes before a nested row, reading the table fails with a `ClassCastException`. If I include all of the fields, it succeeds. I assumed fields would be optional.
I'm using Flink v1.11.1 with the Table SQL API.

*Problem*

If I do not include one of the fields, I get the exception below. If I add back the missing field, `contentId`, this works.

```java
"CREATE TABLE `default.mydb.mytable` (\n" +
"  `userId` STRING, \n" +
"  `timeEpochMillis` BIGINT, \n" +
//"  `contentId` BIGINT, \n" +
"  `contentDetails` ROW<\n" +
"    `contentId` BIGINT >\n" +
") WITH (...)\n"
```

```text
Caused by: java.lang.ClassCastException: java.lang.Long cannot be cast to org.apache.avro.generic.IndexedRecord
    at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.lambda$createRowConverter$80d8b6bd$1(AvroRowDataDeserializationSchema.java:203)
    at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.lambda$createNullableConverter$c3bac5d8$1(AvroRowDataDeserializationSchema.java:221)
    at org.apache.flink.formats.avro.AvroRowDataDeserializationSchema.lambda$createRowConverter$80d8b6bd$1(AvroRowDataDeserializationSchema.java:206)
    at org.apache.flink.formats.avro.AvroFileSystemFormatFactory$RowDataAvroInputFormat.nextRecord(AvroFileSystemFormatFactory.java:204)
    at org.apache.flink.streaming.api.functions.source.InputFormatSourceFunction.run(InputFormatSourceFunction.java:91)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
    at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
    at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:201)
```
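For reference, this is roughly the variant that does succeed for me once every field from the Avro schema is declared. It's a sketch, not my exact code: the `StreamTableEnvironment` setup and the `WITH` options (a filesystem connector with the Avro format, which is what the stack trace points at) are placeholders for my actual configuration.

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class AvroTableExample {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(
                env,
                EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

        // Declaring every field from the Avro schema (including the top-level
        // `contentId`) lets the table be read without the ClassCastException.
        // The connector options below are placeholders for my real setup.
        tableEnv.executeSql(
                "CREATE TABLE `default.mydb.mytable` (\n"
                        + "  `userId` STRING,\n"
                        + "  `timeEpochMillis` BIGINT,\n"
                        + "  `contentId` BIGINT,\n"
                        + "  `contentDetails` ROW<\n"
                        + "    `contentId` BIGINT>\n"
                        + ") WITH (\n"
                        + "  'connector' = 'filesystem',\n"
                        + "  'path' = 'file:///path/to/avro/files',\n"
                        + "  'format' = 'avro'\n"
                        + ")");
    }
}
```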