Thanks for the question, Matthias. My two cents: I don't think we are blocking new feature development. My understanding is that the community will simply prioritize removing deprecated APIs in the 2.0 dev cycle. Because of that, some new feature development may slow down a bit, since some contributors will be working on the must-have features for 2.0. But policy-wise, I don't see a reason to block new feature development in the 2.0 release feature plan[1].
Process-wise, I like your idea of adding the new features as nice-to-have items in the 2.0 feature list.

Re: David, given it is a major version bump, it is possible that some of the downstream projects (e.g. connectors, Paimon, etc.) will have to see whether a major version bump is also needed there. That is probably going to be a decision made on a per-project basis. Regarding the Java version specifically, this is probably worth a separate discussion. According to a recent report[2] on the state of Java, it might be a little early to drop support for Java 11. We can discuss this separately.

Thanks,

Jiangjie (Becket) Qin

[1] https://cwiki.apache.org/confluence/display/FLINK/2.0+Release
[2] https://newrelic.com/sites/default/files/2024-04/new-relic-state-of-the-java-ecosystem-report-2024-04-30.pdf

On Tue, Jun 25, 2024 at 4:58 AM David Radley <david_rad...@uk.ibm.com> wrote:

> Hi,
> I think this is a great question. I am not sure if this has been covered
> elsewhere, but it would be good to be clear how this affects the connectors
> and operator repos, with potentially v1- and v2-oriented new features. I
> suspect this will be a connector-by-connector investigation. I am thinking
> of connectors with Hadoop ecosystem dependencies (e.g. Paimon) which may not
> work nicely with Java 17.
>
> Kind regards, David.
>
>
> From: Matthias Pohl <map...@apache.org>
> Date: Tuesday, 25 June 2024 at 09:57
> To: dev@flink.apache.org <dev@flink.apache.org>
> Cc: Xintong Song <tonysong...@gmail.com>, martijnvis...@apache.org <
> martijnvis...@apache.org>, imj...@gmail.com <imj...@gmail.com>,
> becket....@gmail.com <becket....@gmail.com>
> Subject: [EXTERNAL] [2.0] How to handle on-going feature development in
> Flink 2.0?
>
> Hi 2.0 release managers,
> With the 1.20 release branch being cut [1], master is now referring to
> 2.0-SNAPSHOT. I remember that, initially, the community had the idea of
> keeping the 2.0 release as small as possible, focusing on API changes [2].
>
> What does this mean for new features? I guess blocking them until 2.0 is
> released is not a good option. Shall we treat new features as
> "nice-to-have" items as documented in the 2.0 release overview [3] and
> merge them to master like it was done for minor releases in the past? Do
> you want to add a separate section in the 2.0 release overview [3] to list
> these new features (e.g. FLIP-461 [4]) separately? That might help to
> manage planned 2.0 deprecations/API removals and new features separately.
> Or do you have a different process in mind?
>
> Apologies if this was already discussed somewhere. I didn't manage to find
> anything related to this topic.
>
> Best,
> Matthias
>
> [1] https://lists.apache.org/thread/mwnfd7o10xo6ynx0n640pw9v2opbkm8l
> [2] https://lists.apache.org/thread/b8w5cx0qqbwzzklyn5xxf54vw9ymys1c
> [3] https://cwiki.apache.org/confluence/display/FLINK/2.0+Release
> [4]
> https://cwiki.apache.org/confluence/display/FLINK/FLIP-461%3A+Synchronize+rescaling+with+checkpoint+creation+to+minimize+reprocessing+for+the+AdaptiveScheduler