Hello all,

I saw the recent updates in Flink related to supporting Avro schema evolution 
in state, and I'm curious how Flink handles this internally for Scala case 
classes. I'm working on custom (de-)serialization schemas to read from and 
write to Kafka. However, I'm currently stuck because Avro doesn't natively 
support Scala. This means that to support case class serialization with 
Scala-specific types (like Option, Either, etc.) I need a library like 
Avro4s [1] or AvroHugger [2], which generates schemas at compile time using 
macros. These macro extensions are extremely slow for complex case classes 
(compile times of 15 minutes for a few nested types). I'm looking for an 
approach that avoids these libraries, and I'm therefore curious how Flink 
handles this.
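To illustrate what I mean, here is a minimal sketch of the macro-based derivation I'm using today (the `Order` case class is just a made-up example); Avro4s expands `AvroSchema[T]` at compile time, mapping `Option[Double]` to an Avro union of null and double:

```scala
import com.sksamuel.avro4s.AvroSchema
import org.apache.avro.Schema

// Hypothetical case class with a Scala-specific field type.
case class Order(id: String, discount: Option[Double])

object SchemaDemo extends App {
  // The schema is derived at compile time by the Avro4s macro;
  // Option[Double] becomes the Avro union ["null", "double"].
  val schema: Schema = AvroSchema[Order]
  println(schema.toString(true))
}
```

It's exactly this macro expansion step that blows up compile times once the case classes become deeply nested.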


Does anyone have some good leads on this?


Thanks in advance!


Kind regards,
Wouter Zorgdrager


[1] https://github.com/sksamuel/avro4s
[2] https://github.com/julianpeeters/avrohugger