This has been merged now (both master and release-2.0).
Thank you all for your feedback!
Gyula
On Thu, Feb 27, 2025 at 3:41 AM Zakelly Lan wrote:
Sry for the late reply, also +1 to have this in 2.0 given that we don't
guarantee backwards compatibility, and it is already not compatible in many
ways. Looking forward to this.
Best,
Zakelly
On Thu, Feb 27, 2025 at 9:20 AM Xintong Song wrote:
Thank you both for the efforts. The progress sounds great.
JFYI, there are still a few other blocker issues, and based on the progress
of resolving them my estimation would be creating RC1 around next Friday.
So for this feature there's no need to rush. Take your time for code
reviewing and testing.
A state migration tool that can migrate savepoints/checkpoints is a great idea
that would be very useful for a lot of scenarios. But that is a bigger scope
issue.
(fyi, CI is passing and PR should be fully ready)
The only reason to keep the old version of Kryo and not either upgrade or
remove Kryo is if there is a backward compatibility advantage. If backward
compatibility is breaking anyway, this seems an easy choice.
Java record support was the big motivation for this upgrade. From some simple
tests I'
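As background (not spelled out in this message): the reason older Kryo versions cannot handle Java records is that they write objects field by field via reflection, and recent JDKs treat the final fields of a record as non-modifiable. The sketch below is stdlib-only; the `Point` record and the `recordFieldWritable` helper are made-up illustration names, not anything from the thread or from Kryo's API.

```java
import java.lang.reflect.Field;

public class RecordReflectionDemo {

    // Hypothetical example type, not from the thread.
    record Point(int x, int y) {}

    /** True if a record component field can be opened and written reflectively. */
    static boolean recordFieldWritable() {
        try {
            Field f = Point.class.getDeclaredField("x");
            // Field-by-field writing in the style of Kryo's FieldSerializer:
            // on recent JDKs, setAccessible(true) on a final record field
            // throws InaccessibleObjectException (and even if it succeeded,
            // Field.set would throw IllegalAccessException).
            f.setAccessible(true);
            f.set(new Point(1, 2), 42);
            return true;
        } catch (ReflectiveOperationException | RuntimeException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("record field writable via reflection: " + recordFieldWritable());
    }
}
```

On a current JDK (17+) this prints that the field is not writable, which is why record support needs serializer-level changes rather than plain reflective field access.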
> Hi Gyula,
>
> Thanks for bringing this up! Definitely +1 for upgrading Kryo in Flink
> 2.0. As a side note, it might be useful to introduce customizable generic
> serialization, similar to the "spark.serializer" [1] option. Users
> starting new applications can introduce their own serialization stack in
> this case to resolve the Java compatibility issue or for other
> performance issues.
>
> [1] https://spark.apache.org/docs/latest/configuration.html
>
> Best,
> Zhanghao Chen
>
> From: Gyula Fóra
> Sent: Friday, February 21, 2025 14:04
> To: dev
> Subject: [DISCUSSION] Upgrade to Kryo 5 for Flink 2.0
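For context on the Spark option cited as [1] above: "spark.serializer" is an ordinary configuration entry, so switching the generic serialization stack is a one-line change for the user. A sketch of how it appears in a `spark-defaults.conf` file (class name as documented by Spark; whether Flink would adopt the same shape is only a suggestion in this thread):

```
spark.serializer  org.apache.spark.serializer.KryoSerializer
```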
Hey all!
I would like to rekindle this discussion as it seems that it has stalled
several times in the past and we are nearing the point in time where the
decision has to be made with regards to 2.0. (we are already a bit late but
nevermind)
Perhaps we should provide the backward compatible version in the initial 2.0
release, and provide a deprecation plan for simplifying the code and only
supporting the newer Kryo 5.x in Flink 2.1+. This would be in line with the
previous decision provided in the FLIP's compatibility and deprecation plan.
Thanks for chiming in Timo, I completely agree.
There seem to be 2 approaches available here:
1. Backward compatible: https://github.com/apache/flink/pull/22660
2. Non-compatible: https://github.com/apache/flink/pull/25896
They both seem to be feasible; however, the backward compatible approach is
m
Hi Gyula,
thanks for bringing this up. I agree that we should use one of the most
recent Kryo versions for Flink 2.0. Otherwise the community will suffer
again from incompatibilities and needs to wait for another major Flink
version. Thanks for starting this discussion. We should do it for 2.0
Hey!
I think the main question here is whether Flink 2.0 is going to be state
backward compatible or not.
If not then we need to make the upgrade right now, freeze or not. We have
to decide this as a community.
If we need to preserve backward compatibility then we need to go with a
much more comp
Thanks Gyula for driving this discussion!
+1 for upgrading Kryo, I have a question about the timeline.
Flink 2.0.0 has already been frozen; do we have enough time to
test this if it goes into Flink 2.0.0?
Best,
Rui
On Fri, Feb 21, 2025 at 9:55 PM Alexander Fedulov
wrote:
Hi Gyula,
Thanks for bringing up this topic! Kryo incompatibility with newer
Java versions is a major issue that needs to be addressed, and in my
opinion, Flink 2.0 provides a great opportunity to introduce this
change.
My understanding is that state compatibility was not a strict goal
during 2.0
Hey all!
I would like to rekindle this discussion as it seems that it has stalled
several times in the past and we are nearing the point in time where the
decision has to be made with regards to 2.0. (we are already a bit late but
nevermind)
There have been numerous requests and efforts to upgrade Kryo