Hi Leonard,

Thanks for your comments.
I will try not to focus on the other connectors and the apparent widespread abandonment they seem to be facing (with the exception of the most commonly used ones); I believe that deserves a separate discussion.

Regarding the matter at hand, I find your point about wanting both in the same project completely valid. However, that also becomes the connector's greatest limitation. It is perfectly feasible for a project like this to wait for all of its connectors to evolve before upgrading, but this also limits the evolution of the embedded connector itself. In this case, we are not moving to Flink 2.0 in order to wait for the other connectors, which in turn also holds back the Debezium connector. My main concern is that the evolution of the Flink CDC application shouldn't block the evolution of its primary connector.

Perhaps we can reduce the problem to the Java version "limitation" imposed by Flink:

- For a standalone connector, is it necessary to adhere to that limitation? I remember a discussion about a connector where the minimum Java version was raised to match the connector's requirement (as long as Flink supported it). This implies your job would have to run on that Java version, but if you're already using that connector, it shouldn't be a problem.
- For Flink CDC, I understand we want to keep the version as close as possible to Flink's minimum, which makes a lot of sense.

As for use cases, there are others; for example, we use the connector standalone in various scenarios, which I would be happy to discuss with you offline if you are interested.

On 2025/07/27 05:50:13 Leonard Xu wrote:
> Hey Joao,
>
> Thanks for kicking off this thread, I have some comments as follows:
>
> > Currently, Flink CDC uses Debezium 1.9, which is the last version
> > compatible with Java 8. While an upgrade to Flink 2.0 will likely enable a
> > move to a newer version, we would still be constrained to Debezium 2.7 (the
> > last version supporting Java 11).
>
> This is already part of Flink CDC's roadmap — we've discussed upgrading Flink
> to version 2.0 [1]. However, like other independent connector repositories,
> we prefer to wait until Flink 2.0 becomes more stable before bumping both the
> Flink version and the Debezium version to 2.7.
>
> > With the main Debezium project now at
> > version 3.2 (requiring Java 17), the Flink community is unable to leverage
> > many valuable evolutions. These features may not be critical to the core
>
> In this case, we will still encounter the Java 17 issue regardless of whether
> we choose to decouple the Debezium connector from Flink or upgrade the
> Debezium version within Flink CDC, since both approaches depend on Flink,
> which currently does not support Java 17, right?
>
> So why don't we directly accelerate the upgrade to Flink 2.0 and Debezium 2.7
> in the Flink CDC project? To be honest, I'm not particularly in favor of
> splitting the connectors into external projects and releasing them
> independently.
>
> (1) Frankly, Flink's external connectors currently lack sufficient community
> support for maintenance. For instance, HBase is a sufficiently influential
> project within the big data ecosystem. However, the latest supported Flink
> version is 1.19.x, which was released last year [2]. I've seen contributors
> in the community pushing for adaptation to Flink 2.0. Without enough
> dedicated committers and PMCs willing to invest time and effort into
> developing and maintaining these connectors, their future development cannot
> be guaranteed.
>
> (2) The Flink CDC project has been steadily releasing versions and fixing
> various issues [3]. This is because contributors to the Flink CDC project,
> including myself, come from multiple companies and use Flink CDC extensively
> in our respective organizations. We have the motivation and resources to
> invest in this project, enabling its healthy growth.
>
> (3) Currently, the primary use case for flink-connector-debezium is still
> within the various CDC connectors in the Flink CDC project. I haven't seen
> many other usage scenarios. Even if there are, as mentioned in my previous
> response, the motivation to drive solutions would likely be stronger within
> the Flink CDC project itself.
>
> Overall, I still prefer concentrating the community's limited resources on
> jointly maintaining a single repository. I don't want to see situations where
> Flink is already planning for version 2.2, while certain community-maintained
> connectors remain stuck on version 1.16.
>
> Best,
> Leonard
>
> [1] https://lists.apache.org/thread/n7zq6yt31s9xy77wrqtv2wxdn6gv5ytm
> [2] https://github.com/apache/flink-connector-hbase
> [3] https://github.com/apache/flink-cdc/releases