Hi Gyula,

Sorry for chiming in late, I just noticed the thread. And thanks for bringing this up.
As one of the release managers, +1 for including this in Flink 2.0.

- Flink 2.0 does not guarantee state compatibility; we've mentioned that in the 2.0-preview release announcement. In addition to the Scala upgrade Martijn mentioned, we also introduced built-in serializers for collection types as the default, which is also compatibility breaking.
- Although the feature freeze date has already passed, I think there's a good reason to make this effort an exception. We've been preparing this release for ~2 years. Why not wait another 1-2 weeks, if that really helps? IIRC, this came up somewhere when we planned the release, and was not included only because no one was driving the effort. So it's really nice to see it being picked up.

Just for managing expectations, do we have an estimate of how long this may take?

A related topic: can we provide a state migration tool to help users migrate across incompatible checkpoints / savepoints? I.e., deserialize the states into in-memory objects with the old serialization setup, and serialize them again with the new setup. Such a tool could be helpful not only for a state-incompatible major version bump, but also in scenarios such as:
- The user wants to change the serialization setup.
- State incompatibility due to a user logic change, which the user knows how to handle but lacks the means to do so. E.g., adding a new field to the state data type, where a certain default value can be applied to the previous states.

I'm not entirely sure whether this can work or not; I just lack the capacity to look into it.

Best,
Xintong

On Wed, Feb 26, 2025 at 6:02 PM Gyula Fóra <gyula.f...@gmail.com> wrote:

> Thank you all for your feedback.
>
> Let's leave this open for another day, and unless there is any negative feedback we can go ahead with merging the PR to bump the version for 2.0.
>
> Cheers
> Gyula
>
> On Wed, Feb 26, 2025 at 10:56 AM Jing Ge <j...@ververica.com.invalid> wrote:
>
> > Thanks for bringing this to our attention! I would choose simplicity over backward compatibility, given that Flink 2.0 offers the opportunity for breaking changes. We will benefit from it in the long term. +1 for upgrading Kryo in Flink 2.0 in a non-compatible way.
> >
> > Best regards,
> > Jing
> >
> > On Wed, Feb 26, 2025 at 5:37 AM Nick Nezis <nickne...@apache.org> wrote:
> >
> > > Thanks Martijn.
> > >
> > > That's really great context. In that case, I'll change my previous opinion. I agree that we should proceed with the simpler pull request and get it into the Flink 2.0 release.
> > >
> > > On 2025/02/25 14:06:20 Martijn Visser wrote:
> > > > Hi all,
> > > >
> > > > For the record, I don't think we have a guarantee around backwards compatibility for Flink 2.0 anyway, given that we upgraded Scala to the latest version (because of the bump to JDK 17) and that will potentially break savepoints when using Scala. So I think we should also put this in for Flink 2.0, and just have the right release notes/documentation for this.
> > > >
> > > > Best regards,
> > > >
> > > > Martijn
> > > >
> > > > On Tue, Feb 25, 2025 at 3:31 AM Zhanghao Chen <zhanghao.c...@outlook.com> wrote:
> > > >
> > > > > Hi Gyula,
> > > > >
> > > > > Thanks for bringing this up! Definitely +1 for upgrading Kryo in Flink 2.0.
> > > > > As a side note, it might be useful to introduce customizable generic serializer support like Spark has, where you can switch to your own serializer via the "spark.serializer" [1] option. Users starting new applications could introduce their own serialization stack in this case, to resolve the Java compatibility issue or for other performance reasons.
> > > > >
> > > > > [1] https://spark.apache.org/docs/latest/configuration.html
> > > > >
> > > > > Best,
> > > > > Zhanghao Chen
> > > > > ________________________________
> > > > > From: Gyula Fóra <gyula.f...@gmail.com>
> > > > > Sent: Friday, February 21, 2025 14:04
> > > > > To: dev <dev@flink.apache.org>
> > > > > Subject: [DISCUSSION] Upgrade to Kryo 5 for Flink 2.0
> > > > >
> > > > > Hey all!
> > > > >
> > > > > I would like to rekindle this discussion, as it seems that it has stalled several times in the past, and we are nearing the point in time where the decision has to be made with regard to 2.0 (we are already a bit late, but never mind).
> > > > >
> > > > > There have been numerous requests and efforts to upgrade Kryo to better support newer Java versions and Java native types. I think we can all agree that this change is inevitable one way or another.
> > > > >
> > > > > The latest JIRA for this seems to be:
> > > > > https://issues.apache.org/jira/browse/FLINK-3154
> > > > >
> > > > > There is even an open PR that accomplishes this (currently in a state-incompatible way), but based on the discussion it seems that, with some extra complexity, compatibility could even be preserved by having both the old and new Kryo versions active at the same time.
> > > > >
> > > > > The main question here is whether state compatibility is important for 2.0 in this regard, or whether we want to bite the bullet and make this upgrade once and for all.
> > > > >
> > > > > Cheers,
> > > > > Gyula
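
To make the state-migration idea Xintong describes above a bit more concrete, here is a minimal, hypothetical sketch of the per-value round trip: deserialize with the old serialization setup, then re-serialize with the new one. The class name StateValueMigrator and its shape are made up for illustration; only TypeSerializer, DataInputViewStreamWrapper, and DataOutputViewStreamWrapper are existing Flink classes. A real tool would also need to locate the serialized values inside the checkpoint/savepoint metadata (e.g. building on the State Processor API), which this sketch deliberately leaves out.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.flink.api.common.typeutils.TypeSerializer;
import org.apache.flink.core.memory.DataInputViewStreamWrapper;
import org.apache.flink.core.memory.DataOutputViewStreamWrapper;

/**
 * Hypothetical helper illustrating the core of the migration idea:
 * read one serialized state value with the "old" serializer (e.g. Kryo 2 based)
 * and write it back out with the "new" serializer (e.g. Kryo 5 based).
 */
public final class StateValueMigrator<T> {

    private final TypeSerializer<T> oldSerializer;
    private final TypeSerializer<T> newSerializer;

    public StateValueMigrator(TypeSerializer<T> oldSerializer, TypeSerializer<T> newSerializer) {
        this.oldSerializer = oldSerializer;
        this.newSerializer = newSerializer;
    }

    /** Deserialize with the old setup, then serialize with the new setup. */
    public byte[] migrate(byte[] oldBytes) throws IOException {
        // 1. Deserialize into an in-memory object using the old serializer.
        T value = oldSerializer.deserialize(
                new DataInputViewStreamWrapper(new ByteArrayInputStream(oldBytes)));

        // (Optional) user hook could go here: adjust the object before re-serializing,
        // e.g. fill a newly added field with a default value (Xintong's second scenario).

        // 2. Serialize the object again using the new serializer.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        newSerializer.serialize(value, new DataOutputViewStreamWrapper(out));
        return out.toByteArray();
    }
}

A tool like this would presumably be driven once per state entry while rewriting a savepoint, with the old and new serializers supplied by the respective Kryo 2 and Kryo 5 based setups.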