[ https://issues.apache.org/jira/browse/FLINK-37097?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17912304#comment-17912304 ]
Sergey Nuyanzin commented on FLINK-37097:
-----------------------------------------

I guess that under the current conditions merging to the flink repo should be OK, because it still goes through a working CI. Regarding the external connector, I submitted a PR to support 1.20 (with ported commits): https://github.com/apache/flink-connector-hive/actions/runs/12737879366

There are some issues/questions to settle before moving forward.

1. Since 2.0 is not released yet, it would probably make sense to make a release against Flink 1.20.
2. Since 2.0 drops Java 1.8, this might be a problem, because Hive 3.x does not support JDK 11 (https://issues.apache.org/jira/browse/HIVE-22415). Moreover, this failure (https://issues.apache.org/jira/browse/HIVE-22097) is one of the failures we hit when trying to run the end-to-end tests for Hive 3 with JDK 11.
3. Based on the CI in the Flink main repo, these tests were only ever run against Java 8, never against Java 11 or higher; after dropping JDK 1.8 support, they now seem to be simply skipped (see the illustrative sketch after the quoted issue text below).
4. Hive 4.x has some breaking changes in its interfaces, so supporting it is probably a separate activity.

> Remove Hive connector from core Flink
> -------------------------------------
>
>                 Key: FLINK-37097
>                 URL: https://issues.apache.org/jira/browse/FLINK-37097
>             Project: Flink
>          Issue Type: Technical Debt
>          Components: Connectors / Hive
>    Affects Versions: 2.0.0
>            Reporter: david radley
>            Assignee: david radley
>            Priority: Minor
>
> As per [https://github.com/apache/flink/pull/25947] the Hive code has been externalized into a new repo. We should remove the flink-connector-hive and flink-sql-connector-hive-* Maven modules from core Flink.
>
> I am happy to make this change if someone can assign me the Jira.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
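A side note on point 3: below is a minimal, hypothetical sketch (the class and test names are invented, not taken from flink-connector-hive) of the kind of JDK-version guard that produces exactly this behaviour with JUnit 5: the test executes on Java 8, but once the build runs on JDK 11 or later it is reported as skipped rather than failed, so the CI can stay green while the Hive tests no longer execute.

{code:java}
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.condition.EnabledOnJre;
import org.junit.jupiter.api.condition.JRE;

// Hypothetical example, not taken from the Flink codebase.
class HiveConnectorJdkGuardExampleTest {

    @Test
    @EnabledOnJre(JRE.JAVA_8) // on any newer JRE, JUnit disables the test instead of failing it
    void worksOnlyOnJava8() {
        // placeholder standing in for a real Hive connector assertion
        Assertions.assertTrue(true);
    }
}
{code}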