I recently started working on a PoC with Flink 1.3 that connects to our Kafka cluster and pulls Avro data. Here's the code:
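(The original code block did not survive the archive; below is a minimal sketch, not the actual code, of the kind of setup described: Flink 1.3's Table API with a `Kafka09AvroTableSource` feeding a custom console sink. The topic name, broker address, group id, and the `ConsoleAppendStreamTableSink` / `Ping` classes are the ones mentioned in the post or placeholders.)

```java
import java.util.Properties;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.Kafka09AvroTableSource;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;

public class AvroPocSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "avro-poc");                // placeholder

        // Ping is the Avro-generated SpecificRecord class mentioned in the post;
        // the table source deserializes Kafka records into it.
        Kafka09AvroTableSource source =
                new Kafka09AvroTableSource("ping-topic", props, Ping.class);
        tableEnv.registerTableSource("Pings", source);

        Table pings = tableEnv.scan("Pings");

        // ConsoleAppendStreamTableSink is the custom sink described in the post;
        // it simply calls print() on the underlying DataStream.
        pings.writeToSink(new ConsoleAppendStreamTableSink());

        env.execute("avro-poc");
    }
}
```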
ConsoleAppendStreamTableSink is just a simple TableSink I created while looking at the different table sink types; it just calls print() on the DataStream. Ping is an auto-generated SpecificRecord from an Avro schema.

I'm getting a NotSerializableException inside FlinkKafkaConsumerBase, specifically on the SpecificDatumReader it uses internally. Here's the exception:

This looks to be a straightforward use of the Kafka 0.9 connector, so I'm not sure why I'm running into a serialization issue. Am I missing something obvious? I'm running this with the debugger inside IntelliJ, not on a cluster, though I'm not sure why that would matter.

Any help is greatly appreciated!

Morrigan Jones

--
View this message in context: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Avro-serialization-issue-with-Kafka09AvroTableSource-tp14974.html
Sent from the Apache Flink User Mailing List archive at Nabble.com.