Re: Conversion expects insert-only records but DataStream API record contains: UPDATE_BEFORE

2023-08-13 Thread liu ron
Hi, After a deep dive into the source code, I guess you are using the StreamTableEnvironment#fromDataStream method, which only supports insert-only messages. For your case, I think you should use StreamTableEnvironment#fromChangelogStream [1], which supports consuming update rows. [1] ht
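A minimal sketch of the suggested switch, assuming the incoming DataStream carries Row records tagged with a RowKind; the values and names below are illustrative, not taken from the original thread:

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;
import org.apache.flink.types.RowKind;

public class ChangelogExample {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // A changelog stream: the same key is inserted and then updated,
        // so the stream contains -U (UPDATE_BEFORE) and +U (UPDATE_AFTER) rows.
        DataStream<Row> changelog = env.fromElements(
                Row.ofKind(RowKind.INSERT, "alice", 10),
                Row.ofKind(RowKind.UPDATE_BEFORE, "alice", 10),
                Row.ofKind(RowKind.UPDATE_AFTER, "alice", 20));

        // fromDataStream would reject the -U/+U rows with the error in the subject;
        // fromChangelogStream interprets the RowKind of each record instead.
        Table table = tEnv.fromChangelogStream(changelog);

        table.execute().print();
    }
}
```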

Question about serialization of java.util classes

2023-08-13 Thread s
Greetings, I am working on a project that needs to process around 100k events per second, and I'm trying to improve performance. Most of the classes being used are POJOs, but they have a couple of fields using a `java.util` class such as `ArrayList`, `HashSet`, or `SortedSet`. This forces Flink to
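One way to make any Kryo fallback visible is to disable generic types before running the pipeline. The sketch below uses a hypothetical Event class standing in for the POJOs described above; whether the List field actually falls back to Kryo depends on the Flink version in use:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class GenericTypesCheck {

    // Hypothetical stand-in for the POJOs described above:
    // plain fields plus one java.util.List field.
    public static class Event {
        public String id;
        public List<String> tags = new ArrayList<>();

        public Event() {}
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // With generic types disabled, serializer creation fails with an exception
        // whenever a type would be handled by Kryo, so any fallback caused by the
        // List field surfaces immediately instead of silently costing throughput.
        env.getConfig().disableGenericTypes();

        Event event = new Event();
        event.id = "a";
        event.tags.add("x");

        env.fromElements(event)
           .map(e -> e.tags.size())
           .print();

        env.execute("generic types check");
    }
}
```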

ETL real-time features using Flink with application-level metrics

2023-08-13 Thread Dong Lin
Hi all, I am writing this email to promote our open-source feature store project (FeatHub) that supports using Flink (production-ready) and Spark (not production-ready) to compute real-time / offline features with pythonic declarative feature specifications.

FeatHub : a feature store for ETL real-time features using Flink

2023-08-13 Thread Dong Lin
Dong Lin wrote on Mon, Aug 14, 2023 at 09:02: > Hi all, > > I am writing this email to promote our open-source feature store project ( > FeatHub ) that supports using Flink > (production-ready) and Spark (not production-ready) to compute real-time / > offline features with py

Re: Question about serialization of java.util classes

2023-08-13 Thread liu ron
Hi, According to the test in [1], I think Flink can recognize a POJO class that contains a java List, so you can refer to the related POJO class implementation. [1] https://github.com/apache/flink/blob/fc2b5d8f53a41695117f6eaf4c798cc183cf1e36/flink-core/src/test/java/org/apache/flink/api/jav
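To double-check what the type extractor decides for a given class, a small sketch along these lines can help; Event is a hypothetical POJO with a java.util.List field, mirroring the question rather than the linked test:

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.PojoTypeInfo;
import org.apache.flink.api.java.typeutils.TypeExtractor;

public class PojoTypeCheck {

    // Hypothetical POJO with a java.util.List field, mirroring the question above.
    public static class Event {
        public String id;
        public List<String> tags = new ArrayList<>();

        public Event() {}
    }

    public static void main(String[] args) {
        TypeInformation<Event> info = TypeExtractor.createTypeInfo(Event.class);

        // Prints PojoType<...> if the class follows the POJO rules; the per-field
        // output shows how the List field is typed on the Flink version in use.
        System.out.println(info);
        if (info instanceof PojoTypeInfo) {
            PojoTypeInfo<?> pojoInfo = (PojoTypeInfo<?>) info;
            for (int i = 0; i < pojoInfo.getArity(); i++) {
                System.out.println(pojoInfo.getPojoFieldAt(i));
            }
        }
    }
}
```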

Re: Flink throws exception when submitting a job through Jenkins and Spinnaker

2023-08-13 Thread Shammon FY
Hi, It seems that the client cannot access the right network to submit your job; maybe the address option in k8s is wrong, and you can check the error messages in the k8s logs. Best, Shammon FY On Fri, Aug 11, 2023 at 11:40 PM elakiya udhayanan wrote: > > Hi Team, > We are using Apache Flink 1.16.1 co
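As a first sanity check of the network path from the CI host, one option is to hit the JobManager's REST API directly. The address below is a placeholder; /overview is part of Flink's standard monitoring REST API:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class RestReachabilityCheck {

    public static void main(String[] args) throws Exception {
        // Placeholder address: replace with the JobManager REST endpoint that the
        // Jenkins/Spinnaker client is configured to use (rest.address / rest.port).
        String endpoint = "http://flink-jobmanager.example.com:8081/overview";

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setConnectTimeout(5_000);
        conn.setReadTimeout(5_000);

        // /overview returns a small JSON cluster summary; a timeout or connection
        // refusal here points at the network/address problem rather than the job.
        System.out.println("HTTP " + conn.getResponseCode());
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            reader.lines().forEach(System.out::println);
        }
    }
}
```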