Hi Zdenek,

schema evolution can be tricky in general. I would suggest planning possible schema extensions in advance, using custom serialisation where needed, and making sure that it supports the kinds of evolution you require. For example, a custom Avro serialiser might tolerate adding a field with a default value better. You could have a look at the TypeSerializer abstract class and its API, in particular TypeSerializer.snapshotConfiguration and TypeSerializer.ensureCompatibility. In certain cases state migration might be unavoidable. The community is currently working on better support for state migration in Flink; for now you might still have to write, e.g., your own job to convert/migrate the state in a savepoint.
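To make the Avro point concrete, here is a minimal, Flink-independent sketch of Avro schema resolution: a record written with an old schema is read back with a new reader schema that adds a field with a default value. The Event record, the schemas and the field names are made up for illustration.

import java.io.ByteArrayOutputStream
import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}

object AvroEvolutionSketch {
  // The schema the old job wrote with.
  val writerSchema: Schema = new Schema.Parser().parse(
    """{"type":"record","name":"Event","fields":[
      |  {"name":"id","type":"string"}
      |]}""".stripMargin)

  // The new schema adds "count" with a default, so old records still resolve.
  val readerSchema: Schema = new Schema.Parser().parse(
    """{"type":"record","name":"Event","fields":[
      |  {"name":"id","type":"string"},
      |  {"name":"count","type":"int","default":0}
      |]}""".stripMargin)

  def main(args: Array[String]): Unit = {
    // Serialise a record with the old schema.
    val record = new GenericData.Record(writerSchema)
    record.put("id", "abc")
    val out = new ByteArrayOutputStream()
    val encoder = EncoderFactory.get().binaryEncoder(out, null)
    new GenericDatumWriter[GenericRecord](writerSchema).write(record, encoder)
    encoder.flush()

    // Deserialise with the new schema; "count" falls back to its default.
    val decoder = DecoderFactory.get().binaryDecoder(out.toByteArray, null)
    val reader = new GenericDatumReader[GenericRecord](writerSchema, readerSchema)
    val evolved = reader.read(null, decoder)
    println(evolved) // prints {"id": "abc", "count": 0}
  }
}

Since the default fills in the missing field during schema resolution, adding a field with a default is a backwards-compatible change on the Avro side; the remaining work is wiring such a serialiser into Flink's state handling.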
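And here is a rough outline of the two TypeSerializer hooks, assuming Flink 1.5.x. The Event type, the wire format and the snapshot class are made up; only snapshotConfiguration, ensureCompatibility and CompatibilityResult are the actual Flink API, and the exact set of methods to override may differ slightly between versions.

import java.io.IOException
import org.apache.flink.api.common.typeutils.{CompatibilityResult, TypeSerializer, TypeSerializerConfigSnapshot}
import org.apache.flink.core.memory.{DataInputView, DataOutputView}

case class Event(id: String, count: Int)

// Snapshot written into savepoints; needs a public nullary constructor so
// Flink can re-instantiate it on restore.
class EventSerializerConfigSnapshot extends TypeSerializerConfigSnapshot {
  override def getVersion: Int = 1
  override def equals(obj: Any): Boolean = obj.isInstanceOf[EventSerializerConfigSnapshot]
  override def hashCode(): Int = classOf[EventSerializerConfigSnapshot].hashCode()
}

class EventSerializer extends TypeSerializer[Event] {
  override def isImmutableType: Boolean = true
  override def duplicate(): TypeSerializer[Event] = this // stateless, safe to share
  override def createInstance(): Event = Event("", 0)
  override def copy(from: Event): Event = from
  override def copy(from: Event, reuse: Event): Event = from
  override def getLength: Int = -1 // variable-length records

  @throws[IOException]
  override def serialize(record: Event, target: DataOutputView): Unit = {
    target.writeUTF(record.id)
    target.writeInt(record.count)
  }

  @throws[IOException]
  override def deserialize(source: DataInputView): Event =
    Event(source.readUTF(), source.readInt())

  @throws[IOException]
  override def deserialize(reuse: Event, source: DataInputView): Event =
    deserialize(source)

  @throws[IOException]
  override def copy(source: DataInputView, target: DataOutputView): Unit =
    serialize(deserialize(source), target)

  override def equals(obj: Any): Boolean = obj.isInstanceOf[EventSerializer]
  override def canEqual(obj: Any): Boolean = obj.isInstanceOf[EventSerializer]
  override def hashCode(): Int = classOf[EventSerializer].hashCode()

  // Stored with the savepoint so the restoring job can check compatibility.
  override def snapshotConfiguration(): TypeSerializerConfigSnapshot =
    new EventSerializerConfigSnapshot

  // Called on restore: decide whether the old state can be read as-is.
  override def ensureCompatibility(
      snapshot: TypeSerializerConfigSnapshot): CompatibilityResult[Event] =
    snapshot match {
      case _: EventSerializerConfigSnapshot => CompatibilityResult.compatible()
      case _ => CompatibilityResult.requiresMigration()
    }
}

The idea is that the snapshot captures whatever you need (e.g. the Avro writer schema) to decide in ensureCompatibility whether the restored bytes can be read directly or need migration.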
Best,
Andrey

> On 17 Sep 2018, at 10:24, tisonet <tis...@seznam.cz> wrote:
>
> Hi Andrey,
>
> thanks for the answer.
>
> It seems that it is not possible to handle case class evolution in the
> version which I currently use (1.5.3).
> Do you have any recommendation on how to avoid such problems in the future?
> Adding a new field with a default value to an existing class seems to me
> like a common use case. Can I use a custom class, a POJO class, or Avro
> serde?
>
> Thanks, Zdenek.
>
> --
> Sent from:
> http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/