Re: Paimon missing CatalogFactory

2025-02-04 Thread Yanquan Lv
Hi, Dominik. It seems that you only used Paimon's version number, but you should actually include the Flink version number as well, like org.apache.paimon:paimon-flink-1.19:1.0.0. You can see the main difference from the content of the two links below: [1] https://mvnrepository.com/artifact/org.apache
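
For illustration, a minimal sketch of registering a Paimon catalog once the Flink-version-matched artifact (paimon-flink-1.19:1.0.0) is on the classpath; the catalog name and warehouse path are placeholders rather than details from the thread:

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class PaimonCatalogExample {
        public static void main(String[] args) {
            TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());
            // With the paimon-flink-1.19 jar on the classpath, the 'paimon'
            // CatalogFactory reported missing in this thread is discovered
            // via the jar's service-loader entries and this DDL succeeds.
            tEnv.executeSql(
                "CREATE CATALOG paimon_catalog WITH ("
              + " 'type' = 'paimon',"
              + " 'warehouse' = 'file:///tmp/paimon'"
              + ")");
            tEnv.executeSql("USE CATALOG paimon_catalog");
        }
    }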

Re: Flink OpenSearch connector version supported for Flink 1.20

2025-01-27 Thread Yanquan Lv
Refer to the discussion before; Flink OpenSearch Connector version 1.x is suitable for … lists.apache.org

Re: Flink OpenSearch connector version supported for Flink 1.20

2025-01-27 Thread Yanquan Lv
Hi, what about using org.apache.flink:flink-sql-connector-opensearch2:2.0.0-1.19? This version supports Flink 1.20 and OpenSearch 2. https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-opensearch2/2.0.0-1.19 > On Jan 28, 2025, at 13:27, Swati Jain Goyal via user wrote: > > 1.0.1-1.16
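
A hedged sketch of what using that artifact could look like from the Table API; the factory identifier 'opensearch-2' and the connection options are assumptions to verify against the connector documentation for 2.0.0-1.19 (the DDL shape follows the 1.x OpenSearch SQL connector docs):

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class OpensearchTableExample {
        public static void main(String[] args) {
            TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());
            tEnv.executeSql(
                "CREATE TABLE users_sink ("
              + "  user_id STRING,"
              + "  user_name STRING,"
              + "  PRIMARY KEY (user_id) NOT ENFORCED"
              + ") WITH ("
              + "  'connector' = 'opensearch-2'," // identifier assumed, check docs
              + "  'hosts' = 'http://localhost:9200',"
              + "  'index' = 'users'"
              + ")");
        }
    }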

Re: Flink CDC -> Kafka -> Paimon?

2024-12-09 Thread Yanquan Lv
> > Is this an issue that you've come across? > > For context, this is the link to our setup > <https://github.com/john-mwangi/mariadb-iceberg-pipeline/blob/iceberg-compatibility/dockerfiles/scripts/create_jobs.sql#L165> > on GitHub. > > Regards, > John Mwangi

Re: Flink 1.20 Sink API relies on deprecated InitContext

2024-11-11 Thread Yanquan Lv
Hi Adrien, Yes, we recommend using org.apache.flink.api.connector.sink2.Sink instead of org.apache.flink.streaming.api.functions.sink.SinkFunction or org.apache.flink.api.connector.sink.Sink. As for why this code is still retained in 1.20, one reason is that the Sink InitContext was only
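
A minimal sketch of the recommended Sink V2 interface, assuming Flink 1.19/1.20 where createWriter(WriterInitContext) supersedes the deprecated createWriter(InitContext); depending on the exact minor version, the deprecated overload may also need a stub. The sink itself is a toy that just prints elements:

    import java.io.IOException;
    import org.apache.flink.api.connector.sink2.Sink;
    import org.apache.flink.api.connector.sink2.SinkWriter;
    import org.apache.flink.api.connector.sink2.WriterInitContext;

    public class LoggingSink implements Sink<String> {
        @Override
        public SinkWriter<String> createWriter(WriterInitContext context)
                throws IOException {
            return new SinkWriter<String>() {
                @Override
                public void write(String element, Context ctx) {
                    // A real sink would buffer or send to an external system.
                    System.out.println(element);
                }

                @Override
                public void flush(boolean endOfInput) {
                    // Called on checkpoint; flush buffered data here for
                    // at-least-once delivery.
                }

                @Override
                public void close() {}
            };
        }
    }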

Re: Tentative Flink 2.0 release date?

2024-10-31 Thread Yanquan Lv
some of the imports still depend on Guava 31 (from the older > Flink shaded JAR, i.e. jre31). > > Thanks > > On Mon, Oct 28, 2024 at 4:37 AM Yanquan Lv <mailto:decq12y...@gmail.com>> wrote: >> Hi, Anil. >> In Flink 2.0, the deprecated APIs were removed, wh

Re: Tentative Flink 2.0 release date?

2024-10-28 Thread Yanquan Lv
Hi, Anil. In Flink 2.0, the deprecated APIs were removed, which requires connector adaptation. It is expected that most external connectors will complete the migration work in Flink 2.3. The FlinkCDC community hopes to bump this version and JDK 11 after Flink 2.0 has been released for a period of time

Re: Flink CDC -> Kafka -> Paimon?

2024-10-28 Thread Yanquan Lv
https://issues.apache.org/jira/browse/FLINK-36611 > On Oct 28, 2024, at 18:22, Yanquan Lv wrote: > > Hi, Andrew. > Yeah, currently the output from the Kafka pipeline doesn't contain schema info, > so in the Paimon action it will be considered as a String type. > Your suggestion is very meaningful. I plan to

Re: Flink CDC -> Kafka -> Paimon?

2024-10-28 Thread Yanquan Lv
Hi, Andrew. Yeah, currently the output from the Kafka pipeline doesn't contain schema info, so in the Paimon action it will be considered as a String type. Your suggestion is very meaningful. I plan to support this feature in the next version of FlinkCDC (FlinkCDC 3.3), which may be enabled through a parameter

Re: Opensearch Connector for Flink 1.18+

2024-10-22 Thread Yanquan Lv
Hi, Kirti. We already have a 2.0.0 version in the Maven repo[1] of the OpenSearch connector for Flink 1.18/1.19. But it should be noted that this version is built on JDK 11[2]. [1] https://mvnrepository.com/artifact/org.apache.flink/flink-connector-opensearch2 [2] https://lists.apache.org/thread/3w1rnj
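
A usage sketch on the DataStream side, following the builder shown in the 1.x OpenSearch connector docs; the package and class names for the opensearch2 artifact are assumptions to check against its documentation:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.flink.connector.opensearch.sink.OpensearchSinkBuilder;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.http.HttpHost;
    import org.opensearch.action.index.IndexRequest;
    import org.opensearch.client.Requests;

    public class OpensearchSinkExample {
        // Build one index request per stream element.
        static IndexRequest createIndexRequest(String element) {
            Map<String, Object> json = new HashMap<>();
            json.put("data", element);
            return Requests.indexRequest()
                .index("my-index")
                .source(json);
        }

        static void attachSink(DataStream<String> input) {
            input.sinkTo(
                new OpensearchSinkBuilder<String>()
                    .setHosts(new HttpHost("localhost", 9200, "http"))
                    .setEmitter((element, context, indexer) ->
                        indexer.add(createIndexRequest(element)))
                    .build());
        }
    }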

Re: Flink custom sink

2024-10-14 Thread Yanquan Lv
steps flow charts for better >>> understanding of the sink process with sink, writer and committer? >>> >>> Thanks >>> >>> On Mon, Oct 14, 2024 at 9:48 AM Yanquan Lv >> <mailto:decq12y...@gmail.com>> wrote: >>>> Yeah, TwoPhase

Re: Flink custom sink

2024-10-14 Thread Yanquan Lv
es i.e. FLIP-143, 171, 177 > and 191. So, is there a sequence of steps flow chart for better > understanding of the sink process with sink, writer and committer? > > Thanks > > On Mon, Oct 14, 2024 at 9:48 AM Yanquan Lv wrote: > >> Yeah, TwoPhaseCommittingSink will be removed

Re: Flink custom sink

2024-10-14 Thread Yanquan Lv
le in the flink repo > main branch. Is it replaced with Sink, Committer and SinkWriter? > > Thanks > > On Mon, Oct 14, 2024 at 1:45 AM Yanquan Lv wrote: > >> Hi, Anil. >> >> The Iceberg Sink was merged recently in >> https://github.com/apache/iceberg/pull/1017

Re: Flink custom sink

2024-10-14 Thread Yanquan Lv
Hi, Anil. The Iceberg Sink was merged recently in https://github.com/apache/iceberg/pull/10179#pullrequestreview-2350414880. From your description, I guess that what you need is a TwoPhaseCommittingSink[1]; the steps you listed can be executed as follows (a sketch of the interface appears below): > 1. Group data by category
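
A minimal sketch of the TwoPhaseCommittingSink shape (API as in Flink 1.15-1.18; later minors deprecate it, as noted elsewhere in this thread). The buffering and printing here are toy stand-ins for a real sink's staging and atomic publish:

    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.Collection;
    import java.util.List;
    import org.apache.flink.api.connector.sink2.Committer;
    import org.apache.flink.api.connector.sink2.TwoPhaseCommittingSink;

    public class BufferingTwoPhaseSink
            implements TwoPhaseCommittingSink<String, String> {

        @Override
        public PrecommittingSinkWriter<String, String> createWriter(InitContext ctx)
                throws IOException {
            return new PrecommittingSinkWriter<String, String>() {
                private final List<String> buffer = new ArrayList<>();

                @Override
                public void write(String element, Context c) {
                    buffer.add(element); // stage data between checkpoints
                }

                @Override
                public Collection<String> prepareCommit() {
                    // Phase one: hand staged data over as committables
                    // at the checkpoint barrier.
                    List<String> committables = new ArrayList<>(buffer);
                    buffer.clear();
                    return committables;
                }

                @Override
                public void flush(boolean endOfInput) {}

                @Override
                public void close() {}
            };
        }

        @Override
        public Committer<String> createCommitter() throws IOException {
            return new Committer<String>() {
                @Override
                public void commit(Collection<CommitRequest<String>> requests) {
                    // Phase two: runs after the checkpoint completes; a real
                    // sink would atomically publish each staged committable.
                    for (CommitRequest<String> request : requests) {
                        System.out.println("committed: " + request.getCommittable());
                    }
                }

                @Override
                public void close() {}
            };
        }
    }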

Re: Postgres-CDC start replication fails after stop/start on flink stream

2024-07-03 Thread Yanquan Lv
Hi, David. We've met a similar problem with the PG connection; the error message was 'Socket is closed', and although we put a lot of effort into investigating, we couldn't find the reason. Then we modified the publication mode[1] and only subscribed to the changes of certain tables with the following connector options: '
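
A hedged sketch of that workaround; all connection details are placeholders, and the 'debezium.'-prefixed key is a pass-through Debezium property ('publication.autocreate.mode' = 'filtered') to verify against your postgres-cdc connector version:

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class PgCdcFilteredPublication {
        public static void main(String[] args) {
            TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());
            // 'filtered' makes the auto-created publication cover only the
            // captured tables instead of all tables in the database.
            tEnv.executeSql(
                "CREATE TABLE orders_cdc ("
              + "  id INT,"
              + "  status STRING,"
              + "  PRIMARY KEY (id) NOT ENFORCED"
              + ") WITH ("
              + "  'connector' = 'postgres-cdc',"
              + "  'hostname' = 'localhost',"
              + "  'port' = '5432',"
              + "  'username' = 'flink',"
              + "  'password' = 'secret',"
              + "  'database-name' = 'shop',"
              + "  'schema-name' = 'public',"
              + "  'table-name' = 'orders',"
              + "  'slot.name' = 'orders_slot',"
              + "  'decoding.plugin.name' = 'pgoutput',"
              + "  'debezium.publication.autocreate.mode' = 'filtered'"
              + ")");
        }
    }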