Thank you all for your feedback.

Let's leave this open for another day, and unless there is any negative
feedback we can go ahead with merging the PR to bump the version for 2.0.

Cheers
Gyula

On Wed, Feb 26, 2025 at 10:56 AM Jing Ge <j...@ververica.com.invalid> wrote:

> Thanks for bringing this to our attention! I would choose simplicity over
> backward compatibility, given that Flink 2.0 offers the opportunity for
> breaking changes. We will benefit from it in the long term. +1 for
> upgrading Kryo in Flink 2.0 in a non-compatible way.
>
> Best regards,
> Jing
>
> On Wed, Feb 26, 2025 at 5:37 AM Nick Nezis <nickne...@apache.org> wrote:
>
> > Thanks Martijn.
> >
> > That's really great context. In that case, then I'll change my previous
> > opinion. I agree that we should proceed with the simpler pull request and
> > get it into the Flink 2.0 release.
> >
> > On 2025/02/25 14:06:20 Martijn Visser wrote:
> > > Hi all,
> > >
> > > For the record, I don't think we have a guarantee around backwards
> > > compatibility for Flink 2.0 anyway, given that we upgraded Scala to
> > > the latest version (because of the bump to JDK 17) and that will
> > > potentially break savepoints when using Scala. So I think we should
> > > also put this in for Flink 2.0, and just have the right release
> > > notes/documentation for this.
> > >
> > > Best regards,
> > >
> > > Martijn
> > >
> > > On Tue, Feb 25, 2025 at 3:31 AM Zhanghao Chen <zhanghao.c...@outlook.com>
> > > wrote:
> > >
> > > > Hi Gyula,
> > > >
> > > > Thanks for bringing this up! Definitely +1 for upgrading Kryo in
> > > > Flink 2.0. As a side note, it might be useful to introduce
> > > > customizable generic serializer support like Spark's, where you can
> > > > switch to your own serializer via the "spark.serializer" [1] option.
> > > > Users starting new applications could then plug in their own
> > > > serialization stack to resolve Java compatibility issues like this
> > > > one, or to address other performance concerns.
> > > >
> > > > [1] https://spark.apache.org/docs/latest/configuration.html
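> > > > For illustration, Spark selects its serializer through a single
> > > > configuration entry (a sketch based on [1]; the option name and the
> > > > Kryo serializer class are Spark's own, the rest is just example
> > > > context):
> > > >
> > > >     # spark-defaults.conf
> > > >     spark.serializer    org.apache.spark.serializer.KryoSerializer
> > > >
> > > > A comparable Flink option would let users swap in their own generic
> > > > serializer without being tied to the built-in Kryo version.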
> > > >
> > > >
> > > > Best,
> > > > Zhanghao Chen
> > > > ________________________________
> > > > From: Gyula Fóra <gyula.f...@gmail.com>
> > > > Sent: Friday, February 21, 2025 14:04
> > > > To: dev <dev@flink.apache.org>
> > > > Subject: [DISCUSSION] Upgrade to Kryo 5 for Flink 2.0
> > > >
> > > > Hey all!
> > > >
> > > > I would like to rekindle this discussion, as it has stalled several
> > > > times in the past and we are nearing the point where the decision
> > > > has to be made with regard to 2.0 (we are already a bit late, but
> > > > never mind).
> > > >
> > > > There have been numerous requests and efforts to upgrade Kryo to
> > > > better support newer Java versions and Java native types. I think
> > > > we can all agree that this change is inevitable one way or another.
> > > >
> > > > The latest JIRA for this seems to be:
> > > > https://issues.apache.org/jira/browse/FLINK-3154
> > > >
> > > > There is even an open PR that accomplishes this (currently in a
> > > > state-incompatible way), but based on the discussion it seems that,
> > > > with some extra complexity, compatibility could even be preserved
> > > > by having both the old and new Kryo versions active at the same
> > > > time.
> > > >
> > > > The main question here is whether state compatibility is important
> > > > for 2.0 in this regard, or whether we want to bite the bullet and
> > > > make this upgrade once and for all.
> > > >
> > > > Cheers,
> > > > Gyula
> > > >
> > >
> >
>
