Hi Jingsong Lee,
Thanks for the details. Were you able to achieve end-to-end exactly-once
support with Mongo?
Also, for intermediate reads from Mongo (Kafka -> process event ->
lookup Mongo -> enrich event -> sink to Mongo), I am thinking of using Async
I/O
(https://ci.apache.org/pr
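For reference, the lookup step in that pipeline follows Flink's Async I/O pattern (an AsyncFunction wrapped with AsyncDataStream). A minimal sketch of the idea using plain CompletableFuture, with the Mongo query stubbed out (the lookupMongo helper and its return value are hypothetical, not a real driver call):

```java
import java.util.concurrent.CompletableFuture;

// Sketch of the async-lookup step (event in -> async Mongo lookup -> enriched event out).
// In Flink this logic would live in an AsyncFunction used via AsyncDataStream.unorderedWait;
// here the Mongo query is stubbed so the pattern stays self-contained.
public class AsyncEnrichSketch {

    // Hypothetical stand-in for an async Mongo findOne by key.
    static CompletableFuture<String> lookupMongo(String key) {
        return CompletableFuture.supplyAsync(() -> "doc-for-" + key);
    }

    // Enrich an event with the looked-up document without blocking the caller.
    static CompletableFuture<String> enrich(String event) {
        return lookupMongo(event).thenApply(doc -> event + "|" + doc);
    }

    public static void main(String[] args) {
        // join() only to print the result here; Flink completes a ResultFuture instead.
        System.out.println(enrich("evt1").join()); // evt1|doc-for-evt1
    }
}
```

The key point is that the operator never blocks on the Mongo round trip; the enriched record is emitted when the future completes.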
Hi vijay:
I developed an append stream sink for Mongo internally, which writes data in
batches according to a configurable batch size and also provides asynchronous
flush. It only supports inserts, not updates or upserts. It does the job well
and has been working very well for a long time. (Throughput depends mai
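The batching behavior described above can be sketched in a few lines: buffer records and flush once the configurable batch size is reached. This is only an illustration of the idea, not the internal sink itself; a real Flink SinkFunction would also flush on a timer and in close(), and the insertMany callback here is a stand-in for the Mongo driver's bulk insert:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of an insert-only batching sink: records accumulate in a buffer
// and are handed to insertMany (a stub for collection.insertMany) once the
// configured batch size is reached.
public class BatchingMongoSinkSketch {
    private final int batchSize;
    private final Consumer<List<String>> insertMany;
    private final List<String> buffer = new ArrayList<>();

    public BatchingMongoSinkSketch(int batchSize, Consumer<List<String>> insertMany) {
        this.batchSize = batchSize;
        this.insertMany = insertMany;
    }

    // Called per record; flushes automatically when the buffer fills up.
    public void invoke(String record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Writes any buffered records as one batch and clears the buffer.
    public void flush() {
        if (!buffer.isEmpty()) {
            insertMany.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```

Usage: construct with the batch size and a callback, call invoke() per record, and call flush() on shutdown so the tail of the stream is not lost.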
Hello,
Do we know how much support we have for Mongo? The documentation page points
to a connector repo that is very old (last updated 5 years ago), and it looks
like that was just sample code to showcase the integration.
https://ci.apache.org/projects/flink/flink-docs-stable/dev/batch/con