Snappy Compression for Checkpoints

2023-01-05 Thread Prasanna kumar
Hello Flink Community, We are running jobs in Flink version 1.12.7 which read from Kafka, apply some rules (stored in broadcast state) and then write to Kafka. This is a very low latency, high throughput job and we have set up at-least-once semantics. Checkpoint Configuration Used: 1. We
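The preview cuts off before the checkpoint settings are listed, so purely as a sketch of the kind of configuration under discussion (the interval below is an example value, not the poster's actual setting), at-least-once checkpointing combined with Flink's built-in Snappy snapshot compression looks roughly like this:

    import org.apache.flink.streaming.api.CheckpointingMode;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class CheckpointCompressionSketch {
        public static void main(String[] args) {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Checkpoint every 60s in AT_LEAST_ONCE mode (interval is an example value).
            env.enableCheckpointing(60_000L, CheckpointingMode.AT_LEAST_ONCE);

            // Compress checkpointed/savepointed keyed state with Snappy.
            env.getConfig().setUseSnapshotCompression(true);
        }
    }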

Re: flink add multiple sink in sequence

2023-01-05 Thread Shammon FY
Hi @Great, I think the two sinks in your example are equivalent and independent. If there are some logical relationships between the two sinks, you may need to create a new combined sink and do it yourself. On Thu, Jan 5, 2023 at 11:48 PM Great Info wrote: > > I have a stream from Kafka, after readi
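A combined sink in that spirit might look roughly like the sketch below, which persists each record to the database first and only then publishes it to Kafka (the JDBC URL, table, topic and bootstrap servers are placeholders, and error handling is kept minimal):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.util.Properties;

    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    // Sketch of a combined sink: the database write and the Kafka publish
    // happen in a fixed order per record, inside one operator.
    public class DbThenKafkaSink extends RichSinkFunction<String> {

        private transient Connection connection;
        private transient PreparedStatement insert;
        private transient KafkaProducer<String, String> producer;

        @Override
        public void open(Configuration parameters) throws Exception {
            connection = DriverManager.getConnection("jdbc:postgresql://db-host:5432/mydb", "user", "secret");
            insert = connection.prepareStatement("INSERT INTO enriched_events (payload) VALUES (?)");

            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka-host:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            producer = new KafkaProducer<>(props);
        }

        @Override
        public void invoke(String value, Context context) throws Exception {
            insert.setString(1, value);
            insert.executeUpdate();                                              // step 1: database write
            producer.send(new ProducerRecord<>("enriched-topic", value)).get();  // step 2: publish, wait for ack
        }

        @Override
        public void close() throws Exception {
            if (producer != null) producer.close();
            if (insert != null) insert.close();
            if (connection != null) connection.close();
        }
    }

The stream would then get a single finalInputStream.addSink(new DbThenKafkaSink()) instead of two independent addSink calls, at the cost of giving up the built-in connectors and their fault-tolerance handling.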

Flink Job Manager Recovery from EKS Node Terminations

2023-01-05 Thread Vijay Jammi
Hi, I have a query on the JobManager HA for Flink 1.15. We currently run a standalone Flink cluster with a single JobManager and multiple TaskManagers, deployed on top of a Kubernetes cluster (EKS cluster) in application mode (reactive mode). The TaskManagers are deployed as a ReplicaSet and the
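For a standalone cluster on Kubernetes, the usual starting point for surviving JobManager pod loss is to enable Flink's Kubernetes HA services in flink-conf.yaml; a minimal sketch (the cluster id and storage path are placeholders) would be:

    kubernetes.cluster-id: my-flink-cluster
    high-availability: org.apache.flink.kubernetes.highavailability.KubernetesHaServicesFactory
    high-availability.storageDir: s3://my-bucket/flink/ha

The JobManager pod also needs a service account allowed to read and write ConfigMaps, since leader election and HA metadata are stored there.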

flink add multiple sink in sequence

2023-01-05 Thread Great Info
I have a stream from Kafka; after reading it and doing some transformations/enrichment I need to store the final data stream in the database and publish it to Kafka, so I am planning to add two sinks like below:
finalInputStream.addSink(dataBaseSink); // Sink 1
finalInputStream.addSink( flinkKafkaPr
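Reconstructed for readability (variable names are placeholders rather than the poster's exact code), the wiring being described is just two independent addSink calls on the same stream:

    // Both sinks consume the same enriched stream; Flink runs them as
    // independent operators, with no ordering guarantee between them.
    finalInputStream.addSink(dataBaseSink);      // Sink 1: write to the database
    finalInputStream.addSink(kafkaProducerSink); // Sink 2: publish to Kafka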

Re: Sink to database with multiple statements in a transaction, and with retry

2023-01-05 Thread Yoni Gibbs
Also, it seems that sinks have to be synchronous, I think? (Unless there's some async equivalent of SinkFunction?) Assuming they're synchronous, if I do the retry strategy manually in the implementation of SinkFunction.invoke, that means I'll be blocking that thread while waiting to do a retry
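As a sketch of what the manual-retry approach inside SinkFunction.invoke would look like (the attempt count, backoff and writeRow helper are made up for illustration), the blocking behaviour described above is visible directly in the loop:

    import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

    // Sketch of a blocking retry inside invoke(): the task thread sleeps
    // between attempts, so checkpoints and upstream records are held up.
    public class RetryingDbSink extends RichSinkFunction<String> {

        private static final int MAX_ATTEMPTS = 3;      // example value
        private static final long BACKOFF_MS = 1_000L;  // example value

        @Override
        public void invoke(String value, Context context) throws Exception {
            Exception lastError = null;
            for (int attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
                try {
                    writeRow(value);  // hypothetical database write
                    return;
                } catch (Exception e) {
                    lastError = e;
                    Thread.sleep(BACKOFF_MS * attempt);  // blocks the task thread
                }
            }
            throw lastError;  // give up and let Flink fail/restart the job
        }

        private void writeRow(String value) throws Exception {
            // hypothetical JDBC write; omitted here
        }
    }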

Sink to database with multiple statements in a transaction, and with retry

2023-01-05 Thread Yoni Gibbs
I want to sink some data to a database, but the data needs to go into multiple tables, in a single transaction. Am I right in saying that I cannot use the JDBC Connector for this as it only handles single SQL statements? Assuming that's right, I believe that I need to write a custom sink, so I n
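A bare-bones version of such a custom sink could wrap the per-record statements in a single JDBC transaction; in the sketch below the connection details, table names and columns are invented, and retries and exactly-once concerns are ignored:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

    // Sketch only: each record is written to two tables inside one JDBC
    // transaction, so both inserts succeed or neither does.
    public class TwoTableTransactionalSink extends RichSinkFunction<String> {

        private transient Connection connection;
        private transient PreparedStatement insertHeader;
        private transient PreparedStatement insertDetail;

        @Override
        public void open(Configuration parameters) throws Exception {
            connection = DriverManager.getConnection("jdbc:postgresql://db-host:5432/mydb", "user", "secret");
            connection.setAutoCommit(false);  // we commit explicitly per record
            insertHeader = connection.prepareStatement("INSERT INTO event_header (payload) VALUES (?)");
            insertDetail = connection.prepareStatement("INSERT INTO event_detail (payload) VALUES (?)");
        }

        @Override
        public void invoke(String value, Context context) throws Exception {
            try {
                insertHeader.setString(1, value);
                insertHeader.executeUpdate();
                insertDetail.setString(1, value);
                insertDetail.executeUpdate();
                connection.commit();    // both statements commit together
            } catch (Exception e) {
                connection.rollback();  // undo the partial write
                throw e;
            }
        }

        @Override
        public void close() throws Exception {
            if (connection != null) connection.close();
        }
    }

Committing per record keeps the example simple; batching writes and flushing them on checkpoint (via CheckpointedFunction) is the usual next step, and is roughly what Flink's JDBC sink does for single statements.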