Hi, Hang.
I think the connectors you're referring to are actually Kudu & HBase & RabbitMQ (from the list below, HBase and RabbitMQ are still on 1.16 and Kudu on 1.17, while Elasticsearch already depends on 1.18).
1. For HBase, there is a PR[1] to bump to 1.18 and support 1.18/1.19. We are 
currently discussing in the community[2] whether to add support for 1.20 as 
well.
2. For RabbitMQ, there is a PR to bump to 1.19[3] and add a SinkV2 
implementation. I think we can discuss in the community whether to bump 
directly to 1.19 based on that PR (see the sketch after this list for what a 
SinkV2 sink roughly looks like).
3. For Kudu, it was forked from the Apache Bahir project after Bahir's 
retirement[4]. The latest Bahir release targeted Flink 1.14[5], and there have 
been no release packages since. Given that several versions are already 
missing, I think it is acceptable to release directly against 1.19/1.20.
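
For context, here is a minimal sketch of what a SinkV2-style sink looks like, written against the org.apache.flink.api.connector.sink2 interfaces (roughly the 1.17/1.18 signatures; I believe 1.19 starts deprecating parts of this in favor of WriterInitContext). The class names below are placeholders for illustration, not the actual code in the RabbitMQ PR:

import org.apache.flink.api.connector.sink2.Sink;
import org.apache.flink.api.connector.sink2.SinkWriter;

import java.io.IOException;

// Placeholder class, for illustration only; not the implementation from the PR.
public class SketchRabbitMQSink implements Sink<String> {

    @Override
    public SinkWriter<String> createWriter(InitContext context) throws IOException {
        return new SketchWriter();
    }

    // Stateless writer sketch; a real sink would hold a connection/channel here.
    private static class SketchWriter implements SinkWriter<String> {

        @Override
        public void write(String element, Context context) throws IOException, InterruptedException {
            // Publish the record to the target queue/exchange here.
        }

        @Override
        public void flush(boolean endOfInput) throws IOException, InterruptedException {
            // Flush any buffered publishes before a checkpoint or at end of input.
        }

        @Override
        public void close() throws Exception {
            // Release the connection/channel.
        }
    }
}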


[1] https://github.com/apache/flink-connector-hbase/pull/46#issuecomment-2429102827
[2] https://lists.apache.org/thread/rsdrwv2vz2y83lco9wkxbx7l9rcbqxpv
[3]
[4] https://cwiki.apache.org/confluence/display/FLINK/FLIP-439%3A+Externalize+Kudu+Connector+from+Bahir
[5] https://github.com/apache/bahir-flink/releases/tag/v1.1.0

> On Oct 21, 2024, at 11:24, Hang Ruan <ruanhang1...@gmail.com> wrote:
> 
> Hi, Yanquan.
> 
> Thanks for this discussion.
> 
> Flink 1.20 has been released for two months, and only the Kafka connector
> supports this version so far. It is reasonable to create an issue to track
> this work.
> For connectors whose latest supported version is 1.17 (ES & HBase &
> RabbitMQ), I am considering whether it is fine to skip support for 1.18
> and only add 1.19 & 1.20.
> 
> Best,
> Hang
> 
> 
> On Tue, Oct 15, 2024 at 13:41, Yanquan Lv <decq12y...@gmail.com> wrote:
> 
>> Dear Flink devs, I would like to initiate a discussion about bumping the
>> Flink version that the external connectors depend on to Flink 1.20.
>> Apache Flink 1.20 is expected to be the last 1.x release and a Flink
>> long-term support (LTS) version[1]. For new users and those planning to
>> upgrade their Flink version, this will be the preferred version to use.
>> Since we only promise compatibility with the last 2 minor Flink versions[2],
>> the minimum version we need to rely on is Flink 1.19; however, most of our
>> current external connectors have not been upgraded to this version yet.
>> I would like to create a parent Jira to track the support tasks for Flink
>> 1.20, similar to FLINK-35131[3], including Kafka's bump plan[4],
>> Elasticsearch's bump plan[5], and so on.
>> For connectors that currently rely on Flink 1.16, considering that they
>> iterate infrequently, I tend to provide multiple versions in a single
>> release, such as 1.18, 1.19, and 1.20 release packages, to reduce the
>> number of releases.
>> I'm looking forward to feedback and opinions from the community. Since the
>> workload here is potentially large, I'd greatly appreciate any volunteers
>> for RM.
>> 
>> The following is a list of the Flink versions that the external connectors
>> currently depend on directly:
>> Flink 1.16: flink-connector-hbase/flink-connector-rabbitmq
>> Flink 1.17: flink-connector-kudu
>> Flink 1.18:
>> flink-connector-pulsar/flink-connector-jdbc/flink-connector-mongodb/flink-connector-elasticsearch/flink-connector-cassandra/flink-connector-opensearch/flink-connector-hive
>> Flink 1.19:
>> flink-connector-kafka/flink-connector-aws/flink-connector-prometheus/flink-connector-gcp-pubsub
>> 
>> [1] https://cwiki.apache.org/confluence/display/FLINK/FLIP-458%3A+Long-Term+Support+for+the+Final+Release+of+Apache+Flink+1.x+Line
>> [2] https://cwiki.apache.org/confluence/display/FLINK/Externalized+Connector+development
>> [3] https://issues.apache.org/jira/browse/FLINK-35131
>> [4] discuss thread: https://lists.apache.org/thread/rl7prqop7wfn2o8j2j9fd96dgr1bjjnx
>> [5] discuss thread: https://lists.apache.org/thread/w7h59ddh8ky2fc65wlj6k3gxgshg4js5
