Hi Zdenek,
schema evolution can be tricky in general.
I would suggest planning possible schema extensions in advance,
relying more on custom serialisation, and making sure that it supports the
required types of evolution.
E.g. a custom Avro serialiser might better tolerate adding a field with a
default value.
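As a rough sketch (the schemas and field names below are made up, not taken
from your job): with plain Avro, a reader schema that adds a field with a
default can still decode records written with the old schema, because schema
resolution fills in the default:

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericData, GenericDatumReader, GenericDatumWriter, GenericRecord}
import org.apache.avro.io.{DecoderFactory, EncoderFactory}
import java.io.ByteArrayOutputStream

// Schema the old records were written with.
val writerSchema = new Schema.Parser().parse(
  """{"type":"record","name":"Impression","fields":[
    |  {"name":"campaign_id","type":"long"}
    |]}""".stripMargin)

// New schema: one extra field, with a default so old data stays readable.
val readerSchema = new Schema.Parser().parse(
  """{"type":"record","name":"Impression","fields":[
    |  {"name":"campaign_id","type":"long"},
    |  {"name":"source","type":"string","default":"unknown"}
    |]}""".stripMargin)

// Serialise a record with the old schema...
val record = new GenericData.Record(writerSchema)
record.put("campaign_id", 42L)
val out = new ByteArrayOutputStream()
val encoder = EncoderFactory.get().binaryEncoder(out, null)
new GenericDatumWriter[GenericRecord](writerSchema).write(record, encoder)
encoder.flush()

// ...and read it back with the new schema; "source" gets its default.
val decoder = DecoderFactory.get().binaryDecoder(out.toByteArray, null)
val evolved =
  new GenericDatumReader[GenericRecord](writerSchema, readerSchema).read(null, decoder)
println(evolved.get("source")) // prints "unknown"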
Hi Andrey,
thanks for the answer.
It seems that it is not possible to handle case class evolution in the
version which I currently use (1.5.3).
Do you have any recommendation on how to avoid such a problem in the future?
Adding a new field with a default value to an existing class seems to me
like a common use case. Can
Hi,
Adding a new field to a case class currently breaks the serialisation format
in the savepoint and requires state migration, which Flink does not yet
support implicitly.
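For example, a change of this kind (class and field names here are just
hypothetical) is enough to make an existing savepoint unreadable for the new
serialiser, even though the added field has a default:

// Version of the class the savepoint was written with:
//   case class Impression(campaign_id: Long, count: Long)
// Version of the job that tries to restore that savepoint; the extra field
// changes the serialised layout despite the default value:
case class Impression(campaign_id: Long, count: Long, source: String = "unknown")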
That said, I would have expected the failure to surface earlier, during the
compatibility check upon restore.
According to the
Hi,
I run a simple streaming job where I compute hourly impressions for campaigns:
.keyBy(imp => imp.campaign_id)
.window(TumblingEventTimeWindows.of(...))
.aggregate(new BudgetSpendingByImpsAggregateFunction(),
           new BudgetSpendingByImpsWindowFunction())
Where the aggregate function just sums impressions.
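A simplified sketch of the shape of that aggregate function (the Impression
type and the Long output here are stand-ins, the real classes have more
fields):

import org.apache.flink.api.common.functions.AggregateFunction

// Stand-in input type; only campaign_id matters for this sketch.
case class Impression(campaign_id: Long)

// Counts impressions per campaign within each hourly window.
class BudgetSpendingByImpsAggregateFunction
    extends AggregateFunction[Impression, Long, Long] {

  override def createAccumulator(): Long = 0L

  // Invoked for every impression that falls into the window.
  override def add(value: Impression, accumulator: Long): Long =
    accumulator + 1L

  // Handed to BudgetSpendingByImpsWindowFunction when the window fires.
  override def getResult(accumulator: Long): Long = accumulator

  // Merges partial accumulators.
  override def merge(a: Long, b: Long): Long = a + b
}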