Hello,
I need to process events in near real-time. The events are generated by
various upstream sources and are currently stored in Kafka. I want to build a
pipeline that reads the data as a continuous stream, enriches the events, and
finally stores them in both ClickHouse and Kafka sinks.
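Just to make the shape concrete, here is roughly what I have in mind for the
read-and-enrich part, sketched with the plain Kafka consumer API purely for
illustration (the topic name and the enrich() step are placeholders):

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class EnrichPipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "enrich-pipeline");
        props.put("enable.auto.commit", "false");
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("raw-events"));   // placeholder source topic
            while (true) {
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    String enriched = enrich(record.value());
                    // buffer `enriched` here for the bulk write described below
                }
            }
        }
    }

    // placeholder enrichment step
    static String enrich(String event) {
        return event + ",enriched";
    }
}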
To get better ingest performance at the ClickHouse sink, I have to buffer
the data and do a bulk insert. I also need to write to Kafka transactionally,
i.e., if I buffer 1000 enriched events, I should be able to bulk ingest them
to both ClickHouse and Kafka atomically. Partial writes to either sink are
not acceptable, and ClickHouse does not support full-fledged transactions
yet.
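To illustrate the gap, this is roughly what a best-effort flush of such a
buffer looks like with the Kafka transactional producer and a ClickHouse JDBC
batch insert (connection details, table, and topic names are placeholders):
if the ClickHouse batch fails I can abort the Kafka transaction, but the
reverse is not true, and the ClickHouse side may still end up partially
written.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class DualSinkFlusher {
    private final KafkaProducer<String, String> producer;
    private final Connection clickHouse;

    public DualSinkFlusher() throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("transactional.id", "enriched-writer-1");   // required for transactions
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
        producer.initTransactions();
        clickHouse = DriverManager.getConnection(
                "jdbc:clickhouse://localhost:8123/default");   // placeholder DSN
    }

    // Flush a buffer of ~1000 enriched events to both sinks (best effort).
    public void flush(List<String> buffer) throws Exception {
        producer.beginTransaction();
        try (PreparedStatement stmt = clickHouse.prepareStatement(
                "INSERT INTO enriched_events (payload) VALUES (?)")) {
            for (String event : buffer) {
                stmt.setString(1, event);
                stmt.addBatch();
                producer.send(new ProducerRecord<>("enriched-events", event));
            }
            stmt.executeBatch();            // bulk insert into ClickHouse
            producer.commitTransaction();   // commit Kafka only if ClickHouse succeeded
        } catch (Exception e) {
            producer.abortTransaction();    // Kafka rolls back; ClickHouse may be partially written
            throw e;
        }
    }
}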
Given the above constraints, could you please suggest options that I could
explore?
Thanks,
Vijay


