Live updating Serialization Schemas in Flink
Hi flink-users! I need advice on how to tackle a programming problem I'm facing. I have a bunch of jobs that look something like this sketch:

    Source kafkaSource;
    kafkaSource
        .map(function that takes a generic record)
        .map( ... )
        ...
        .sink(kafka sink that takes in generic records)
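For concreteness, here is one way the sketch above might look as a real job. This is only a hedged illustration, assuming the Flink DataStream API, the flink-connector-kafka and flink-avro modules, and Avro GenericRecord payloads; the topic names, bootstrap servers, and the fixed schema string are placeholders I invented. Note that the Avro Schema is baked in at job submission time, which is exactly the limitation the question is about:

    import java.util.Properties;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.flink.formats.avro.AvroDeserializationSchema;
    import org.apache.flink.formats.avro.AvroSerializationSchema;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer;

    public class GenericRecordJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092");

            // The schema is resolved once, when the job is submitted.
            // Updating it "live" without restarting the job is the open question.
            Schema schema = new Schema.Parser().parse("{ ... placeholder ... }");

            FlinkKafkaConsumer<GenericRecord> source = new FlinkKafkaConsumer<>(
                "input-topic",
                AvroDeserializationSchema.forGeneric(schema),
                props);

            FlinkKafkaProducer<GenericRecord> sink = new FlinkKafkaProducer<>(
                "output-topic",
                AvroSerializationSchema.forGeneric(schema),
                props);

            env.addSource(source)
                // transformation stages operating on generic records
                .map(record -> record)
                .addSink(sink);

            env.execute("generic-record-pipeline");
        }
    }

Because the (de)serialization schemas are constructed on the client and shipped with the job graph, a schema change ordinarily means a stop/redeploy; approaches people discuss for avoiding that include a schema-registry-aware deserializer or re-resolving the schema inside an open()/RichFunction, but which fits best depends on the setup.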
ticket if anyone's curious. Thanks! -Hunter Herman