Hi Yanquan,
Thanks for your support. Yes, and support for the new dialects too.

It would be great to get the open PRs merged to support new dialects if possible, 
including:
https://github.com/apache/flink-connector-jdbc/pull/118 - this has been around 
for a while with the submitter asking for a merge; the CI tests are currently 
failing, but if you can reassure him that you could merge after a rebase, that 
would be fabulous.

and
https://github.com/apache/flink-connector-jdbc/pull/149
OpenLineage support would be great so we have more complete support for Flink 
1.20 and 2.0.

The next step is to agree a committer to be the release manager; are you 
interested in doing this?
Kind regards, David.


From: Yanquan Lv <decq12y...@gmail.com>
Date: Monday, 2 December 2024 at 01:57
To: dev@flink.apache.org <dev@flink.apache.org>
Subject: [EXTERNAL] Re: Plans for JDBC connector for 1.20?
Hi, David.

+1 to cut main into a new 1.20-compatible release, as a flink-connector-jdbc 
release for 1.20 is frequently requested by users and is also a priority for 
adapting to Flink 2.0.

I've checked the issues[1] that were resolved in 3.3. Most of them were 
introduced by FLIP-377: Support fine-grained configuration to control filter 
push down for Table/SQL Sources[2] and FLIP-449: Reorganization of 
flink-connector-jdbc[3], and only two bugfixes[4] relate to the jdbc-3.2.0 
production code, which I don't think is a blocker that requires a release on 
its own.

[1]https://issues.apache.org/jira/browse/FLINK-33460?jql=project%20%3D%20FLINK%20AND%20status%20%3D%20Resolved%20AND%20fixVersion%20%3D%20jdbc-3.3.0
[2]https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=276105768
[3]https://cwiki.apache.org/confluence/display/FLINK/FLIP-449%3A+Reorganization+of+flink-connector-jdbc
[4]https://issues.apache.org/jira/browse/FLINK-35542?jql=project%20%3D%20FLINK%20AND%20issuetype%20%3D%20Bug%20AND%20status%20%3D%20Resolved%20AND%20fixVersion%20%3D%20jdbc-3.3.0





> On 22 Nov 2024, at 01:15, David Radley <david_rad...@uk.ibm.com> wrote:
>
> Hi,
> Is there a plan to release the JDBC connector for Flink 1.20? Apologies if this 
> is already in hand; I could not find anything:
>
>  *   I notice that the last JDBC connector release, corresponding to Flink 
> 1.19, contained minimal content.
>  *   The main branch now has a pom with 1.20 and contains a massive amount 
> of content compared to 1.19, which has only minimal cherry-picked fixes in it.
>  *   So the worry is that we have a large amount of unshipped code in main, 
> amplified by the fact that it introduces support for many new databases.
>  *   The latest code appears to have OpenLineage support with 
> https://github.com/apache/flink-connector-jdbc/pull/137, so I assume the main 
> code is no longer compatible with Flink 1.19.
>
>
> Thinking about how to move this forward. Some options:
>
>  1.  I am wondering whether there is an appetite to cut main into a new 
> 1.20-compatible release pretty much as-is.
>  2.  We could do a 1.19- and 1.20-compatible release like the Kafka connector 
> 3.3, and a 1.20 OpenLineage release like the Kafka connector 3.4, with 
> minimal critical backported fixes.
>
> I am keen to do option 1), as the main code will otherwise continue to 
> diverge from what we release. I wonder what manual per-database testing would 
> be required to reassure us that all the new database support actually works, 
> or whether the existing automated testing is enough.
>
> If we get some consensus on the approach, I can help drive this, as this is 
> going to be an inhibitor for many of us in the community migrating to Flink 
> 1.20 and subsequently to Flink v2.
>
> Kind regards,      David.
>

Unless otherwise stated above:

IBM United Kingdom Limited
Registered in England and Wales with number 741598
Registered office: Building C, IBM Hursley Office, Hursley Park Road, 
Winchester, Hampshire SO21 2JN
