Thanks for the explanation, Matthias.

In the example you raised, would it be better to just keep both YARN and
K8S support in the new major version, but with YARN support deprecated, if
we want to? We could say that for YARN we will only provide bug fixes and
no feature development anymore. Given that these two features probably
live in two independent modules, keeping both modules in the same new
major version likely has zero additional cost compared with maintaining
them in two different major versions. This way we avoid the non-linear
versioning issue, have fewer releases, and save a lot of maintenance
effort on multiple development branches.

Regarding the stability demotion, I see your point. However, it still
feels a little odd to me to demote a Public API to PublicEvolving just for
the purpose of removing code in a minor version. It also leads to
counter-intuitive situations. For example, assuming users only use Public
APIs, they may be able to upgrade from 1.19.0 to 2.0 just fine, but
upgrading from 1.19 to 2.1 would not work because the Public API has been
removed, even though from the users' perspective both of them are major
version upgrades. So, in this case, I would rather bump the major version
again to remove the deprecated Public API. That seems simpler and does not
complicate the well-established semantic versioning conventions.
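
To make that concrete, here is a rough sketch of what the demotion
approach would mean in code, using Flink's stability annotations. The
class, package and replacement names below are all made up, this is not an
actual Flink API. Under that scheme the API could then legally be removed
in a later 2.x minor release, which is exactly the jump that would
surprise a user coming from 1.19:

    // In 1.x the API carries the strongest guarantee:
    //
    //   @Public
    //   public interface FooClient { ... }

    // In 2.0 it would be demoted and deprecated, which is what then allows
    // removing it in a 2.x minor release:
    package org.apache.flink.example; // hypothetical package

    import org.apache.flink.annotation.PublicEvolving;

    /**
     * @deprecated Since 2.0. Use {@code BarClient} (hypothetical
     *             replacement) instead.
     */
    @Deprecated
    @PublicEvolving // demoted from @Public in 2.0
    public interface FooClient {
        void foo();
    }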

Thanks,

Jiangjie (Becket) Qin

On Wed, Jun 14, 2023 at 9:27 PM Matthias Pohl
<matthias.p...@aiven.io.invalid> wrote:

> One (made-up) example off the top of my head would have been that the
> community decides to focus fully on Kubernetes without considering Yarn
> anymore because of some must-have feature on the Kubernetes side. At the
> same time there are still some users for whom it would be tricky to migrate
> from Yarn to Kubernetes. Therefore, there would be some desire to still
> maintain the older major version of Flink that supports Yarn.
>
> But now that I'm thinking about the example, I realize: a discussion on how
> we would handle the two major versions in that case could be started when
> we actually run into this issue. It shouldn't be too hard to migrate to a
> non-linear versioning from what you are proposing if such a scenario comes
> up.
>
> And on the topic of downgrading from @Public to @PublicEvolving:
>
> > Demoting a Public API to PublicEvolving API sounds hacky. From what I
> > understand the stability guarantee is not revocable because users will
> > rely on the stability guarantee to plan their usage of Flink. Demoting
> > an API essentially defeats the very purpose of stability guarantee to
> > begin with.
> >
>
> My understanding is that the stability guarantee only applies within a
> major version. Downgrading the stability constraint for an API with a new
> major version still seems to comply with the definition of the @Public
> annotation, as it's similar to changing the API in other ways. But I'm not
> insisting on that approach. I just found it a reasonable workaround.
>
> Thanks,
> Matthias
>
> On Wed, Jun 14, 2023 at 11:38 AM Becket Qin <becket....@gmail.com> wrote:
>
> > Hi Matthias,
> >
> > Thanks for the feedback.
> >
> > Do you have an example of behavioral change in mind? Not sure I fully
> > understand the concern for behavioral change here. From what I understand,
> > any user-facing change to an existing API, regardless of its kind (API
> > signature or behavioral change), can always be done in the following way:
> >
> > 1. Introduce a new API (new package, new class/interface, new method, new
> > config, new metric, etc.) while marking the old one as deprecated.
> > 2. Let the new API and the deprecated API coexist for a migration period
> > to allow users to plan their migration.
> > 3. Remove the deprecated API.
> >
> > For example, Kafka deprecated its old consumer and replaced it with a new
> > Consumer - basically everything changed. The source code of the old
> > consumer was kept there for a few years across multiple major versions.
> > This does mean we have to keep both of the APIs for a few releases, and
> > even fix bugs in the old consumer, so additional maintenance effort is
> > required. But this allows the users to keep up with Kafka releases, which
> > is extremely rewarding.
> >
> > Thanks,
> >
> > Jiangjie (Becket) Qin
> >
> > On Wed, Jun 14, 2023 at 5:06 PM Matthias Pohl
> > <matthias.p...@aiven.io.invalid> wrote:
> >
> > > Thanks for starting this discussion, Becket. A few good points were
> > > raised. Here's what I want to add:
> > >
> > > Stefan raised the point of behavioral stability (in contrast to API
> > > stability). That might be a reason for users to not be able to go ahead
> > > with a major version bump. Working around behavioral changes might be
> > > trickier than just switching from deprecated to newer APIs. I see your
> > > motivation of having a more linear versioning even between major
> > > versions to avoid backports. Backports are painful enough for minor
> > > versions.
> > >
> > > But with major versions becoming a thing in the Flink cosmos, I could
> > > imagine that the behavioral stability Stefan mentions actually could
> > > become a bigger issue: Major versions down the road might include bigger
> > > behavioral changes which would prevent users from going ahead with the
> > > major version bump. I understand that this is out of the original scope
> > > of this FLIP. But nevertheless, it does support Chesnay's concerns that
> > > a linear versioning without maintaining older major versions might not
> > > be feasible. It sounds like we should have a discussion about how we
> > > treat older major versions here (or have a dedicated discussion on that
> > > topic before going ahead with that FLIP).
> > >
> > > On another note: I like Xintong's proposal of downgrading an API
> > > from @Public to @PublicEvolving in the new major version. That would
> > > allow us to keep the original intention of the @Public annotation alive
> > > (i.e. that those APIs are only removed in the next major version).
> > >
> > > Matthias
> > >
> > > On Wed, Jun 14, 2023 at 10:10 AM Xintong Song <tonysong...@gmail.com>
> > > wrote:
> > >
> > > > Thanks for bringing up this discussion, Becket.
> > > >
> > > > My two cents:
> > > >
> > > > 1. Do we allow deprecation & removal of APIs without adding a new one
> > > > as a replacement? The examples in the table give me the impression
> > > > that marking an API as `@Deprecated` should only happen at the same
> > > > time as introducing a new replacing API, which I think is true in most
> > > > but not all the cases.
> > > >
> > > > > If there is a major version bump before 2 minor releases in the
> > > > > current major version are reached, the major version should keep the
> > > > > source code in its own minor version until two minor versions are
> > > > > reached. For example, in the above case, if Flink 2.0 is released
> > > > > after 1.20, then the deprecated source code of foo will be kept in
> > > > > 2.0 and all the 2.x versions. It can only be removed in 3.0.
> > > > >
> > > >
> > > > 2. I think this might be a bit too strict. For an API that we have
> > > > already decided to remove, having to keep it for all the 2.x versions
> > > > simply because there are fewer than 2 minor releases between making
> > > > the decision and the major release bump seems unnecessary.
> > > > Alternatively, I'd like to propose removing the `@Public` annotation
> > > > (or downgrading it to `@PublicEvolving`) in 2.0, and removing the API
> > > > itself in 2.2.
> > > >
> > > > Best,
> > > >
> > > > Xintong
> > > >
> > > >
> > > >
> > > > On Wed, Jun 14, 2023 at 3:56 PM Becket Qin <becket....@gmail.com>
> > > > wrote:
> > > >
> > > > > Hi Jing,
> > > > >
> > > > > Thanks for the feedback. Please see the answers to your questions
> > > > > below:
> > > > >
> > > > > *"Always add a "Since X.X.X" comment to indicate when was a class /
> > > > > > interface / method marked as deprecated."*
> > > > > >  Could you describe it with a code example? Do you mean Java
> > > comments?
> > > > >
> > > > > It is just a comment such as "Since 1.18. Use XXX
> > > > > <https://kafka.apache.org/31/javadoc/org/apache/kafka/clients/admin/Admin.html#incrementalAlterConfigs(java.util.Map)>
> > > > > instead.". And we can then look it up in the deprecated list[1] in
> > > > > each release and see which method should / can be deprecated.
> > > > >
> > > > > *"At least 1 patch release for the affected minor release for
> > > > > > Experimental APIs"*
> > > > > > The rule is absolutely right. However, afaiac, deprecation is
> > > different
> > > > > as
> > > > > > modification. As a user/dev, I would appreciate, if I do not need
> > to
> > > do
> > > > > any
> > > > > > migration work for any deprecated API between patch releases
> > upgrade.
> > > > > BTW,
> > > > > > if experimental APIs are allowed to change between patches, could
> > we
> > > > just
> > > > > > change them instead of marking them as deprecated and create new
> > ones
> > > > to
> > > > > > replace them?
> > > > >
> > > > > Deprecating an API is just a more elegant way of replacing an API
> > > > > with a new one. The only difference between the two is whether the
> > > > > old API is kept and coexists with the new API for some releases or
> > > > > not. For end users, deprecation-then-remove is much more friendly
> > > > > than direct replacement.
> > > > >
> > > > > > 1. How do we make sure the new APIs cover all functionality, i.e.
> > > > > > are backward compatible, before removing the deprecated APIs?
> > > > > > Since the functionalities can only be built with the new APIs
> > > > > > iteratively, there will be a period (possibly longer than the
> > > > > > migration period) during which the new APIs are not backward
> > > > > > compatible with the deprecated ones.
> > > > >
> > > > > This is orthogonal to the deprecation process, and may not even be
> > > > > required in some cases if the function changes by itself. But in
> > > > > general, this relies on the developer to decide. A simple test on
> > > > > readiness is to see if all the UT / IT cases written with the old
> > > > > API can be migrated to the new one and still work. If the new API is
> > > > > not ready, we probably should not deprecate the old one to begin
> > > > > with.
> > > > >
> > > > > > 2. Is it allowed to remove the deprecated APIs after the defined
> > > > > > migration period expires while the new APIs are still not backward
> > > > > > compatible?
> > > > >
> > > > > By "backwards compatible", do you mean functionally equivalent? If
> > the
> > > > new
> > > > > APIs are designed to be not backwards compatible, then removing the
> > > > > deprecated source code is definitely allowed. If we don't think the
> > new
> > > > API
> > > > > is ready to take over the place for the old one, then we should
> wait.
> > > The
> > > > > migration period is the minimum time we have to wait before
> removing
> > > the
> > > > > source code. A longer migration period is OK.
> > > > >
> > > > > > 3. For the case of a core API upgrade with downstream
> > > > > > implementations, e.g. connectors, what is the feasible deprecation
> > > > > > strategy? Option 1, bottom-up: make sure the downstream
> > > > > > implementation is backward compatible before removing the
> > > > > > deprecated core APIs. Option 2, top-down: once the downstream
> > > > > > implementation of the new APIs works fine, we can remove the
> > > > > > deprecated core APIs after the migration period expires. The
> > > > > > implementation of the deprecated APIs will not get any further
> > > > > > updates in upcoming releases (it has been removed). There might be
> > > > > > some missing features in the downstream implementation of the new
> > > > > > APIs compared to the old implementation. Both options have their
> > > > > > own pros and cons.
> > > > >
> > > > > The downstream projects such as connectors in Flink should also
> > > > > follow the migration path we tell our users, i.e. if there is a
> > > > > cascading backwards-incompatible change in the connectors due to a
> > > > > backwards-incompatible change in the core, and as a result a longer
> > > > > migration period is required, then I think we should postpone the
> > > > > removal of the source code. But in general, we should be able to
> > > > > provide the same migration period in the connectors as in the Flink
> > > > > core, if the connectors are upgraded to the latest version of the
> > > > > core promptly.
> > > > >
> > > > > Thanks,
> > > > >
> > > > > Jiangjie (Becket) Qin
> > > > >
> > > > > [1]
> > > > > https://nightlies.apache.org/flink/flink-docs-master/api/java/deprecated-list.html
> > > > >
> > > > >
> > > > > > On Wed, Jun 14, 2023 at 1:15 AM Jing Ge <j...@ververica.com.invalid>
> > > > > > wrote:
> > > > >
> > > > > > > This is by design. Most of these are @Public APIs that we had to
> > > > > > > carry around until Flink 2.0, because that was the initial
> > > > > > > guarantee that we gave people.
> > > > > > >
> > > > > >
> > > > > > True, I knew @Public APIs could not be removed before the next
> > > > > > major release. I meant house cleaning without violating these
> > > > > > annotations' design concept, i.e. especially cleaning up
> > > > > > @PublicEvolving APIs since they are customer-facing. Regular
> > > > > > cleanup of all other @Experimental and @Internal APIs would be
> > > > > > even better, in case there are some APIs marked as @deprecated.
> > > > > >
> > > > > > Best regards,
> > > > > > Jing
> > > > > >
> > > > > >
> > > > > >
> > > > > > On Tue, Jun 13, 2023 at 4:25 PM Chesnay Schepler <ches...@apache.org>
> > > > > > wrote:
> > > > > >
> > > > > > > On 13/06/2023 12:50, Jing Ge wrote:
> > > > > > > > One major issue we have, afaiu, is caused by the lack of
> > > > > > > > housekeeping/house cleaning, there are many APIs that were
> > > > > > > > marked as deprecated a few years ago and still don't get
> > > > > > > > removed. Some APIs should be easy to remove and others will
> > > > > > > > need some more clear rules, like the issue discussed at [1].
> > > > > > >
> > > > > > > This is by design. Most of these are @Public APIs that we had to
> > > > > > > carry around until Flink 2.0, because that was the initial
> > > > > > > guarantee that we gave people.
> > > > > >
> > > > > >
> > > > > > >
> > > > > > > As for the FLIP, I like the idea of explicitly writing down a
> > > > > > > deprecation period for APIs, particularly PublicEvolving ones.
> > > > > > > For Experimental I don't think it'd be a problem if we could
> > > > > > > change them right away, but looking back a bit I don't think it
> > > > > > > hurts us to also enforce some deprecation period.
> > > > > > > 1 release for both of these sounds fine to me.
> > > > > > >
> > > > > > >
> > > > > > > My major point of contention is the removal of Public APIs
> > > > > > > between minor versions.
> > > > > > > This to me would be a major setback for a simpler upgrade path
> > > > > > > for users.
> > > > > > > If these can be removed in minor versions, then what even is a
> > > > > > > major release?
> > > > > > > The very definition we have for Public APIs is that they stick
> > > > > > > around until the next major one.
> > > > > > > Any rule that theoretically allows for breaking changes in
> > > > > > > Public API in every minor release is in my opinion not a viable
> > > > > > > option.
> > > > > > >
> > > > > > >
> > > > > > > The "carry recent Public APIs forward into the next major
> > release"
> > > > > thing
> > > > > > > seems to presume a linear release history (aka, if 2.0 is
> > released
> > > > > after
> > > > > > > 1.20, then there will be no 1.21), which I doubt will be the
> > case.
> > > > The
> > > > > > > idea behind it is good, but I'd say the right conclusion would
> be
> > > to
> > > > > not
> > > > > > > make that API public if we know a new major release hits in 3
> > > months
> > > > > and
> > > > > > > is about to modify it. With a regular schedule for major
> releases
> > > > this
> > > > > > > wouldn't be difficult to do.
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>
