Hello,

I'm new to Flink and I need some advice on the best approach to do the
following:
- read some items from a Kafka topic
- on the Flink stream side, after some simple filtering steps, group these
items into batches by Flink processing time
- insert the items into a PostgreSQL database using a batch insert
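For concreteness, the pipeline I have in mind looks roughly like the sketch below (topic name, filter predicate, and the sink class are placeholders; I'm on the Flink 1.x DataStream API with the Kafka 0.9 connector):

```java
import java.util.Properties;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class KafkaToPostgresJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder

        env.addSource(new FlinkKafkaConsumer09<>(
                        "items", new SimpleStringSchema(), props)) // read from Kafka
           .filter(item -> !item.isEmpty())       // simple filtering step
           .timeWindowAll(Time.seconds(1))        // group by processing time
           .addSink(new PostgresBatchSink());     // hypothetical batch-insert sink
        env.execute("kafka-to-postgres");
    }
}
```

This compiles against the Flink 1.x APIs as far as I can tell, but `PostgresBatchSink` is exactly the part I'm unsure how to structure, as described below.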

I did this by using a time window of 1 second and adding a custom sink which
collects items in a blocking queue. Additionally, I need a separate thread
which triggers the commit to the database after some interval shorter than
the window size.
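Stripped of the Flink and JDBC details, the batching logic of my current sink amounts to this plain-Java sketch (class and method names are mine, not Flink's): the sink's invoke() offers each item to the queue, and the separate flusher thread periodically drains it and would issue one batch INSERT per drained list.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Batching buffer shared between the sink and the flusher thread.
class TimedBatchBuffer {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    // Called from the sink's invoke() for every incoming item.
    void add(String item) {
        queue.offer(item);
    }

    // Called by the flusher thread at a fixed interval (shorter than the
    // window size); each non-empty batch becomes one batch INSERT.
    List<String> drainBatch() {
        List<String> batch = new ArrayList<>();
        queue.drainTo(batch);
        return batch;
    }
}
```

The awkward part is that the flush interval and the window size have to be tuned against each other by hand, which is what I'd like to avoid.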

The solution works, but I am not very pleased with it because it looks
overly complicated for a simple item-batching task.

Is there any way to trigger the commit directly when the window closes? I
didn't find any way to get notified when a window is complete. I would like
to get rid of this separate thread whose only purpose is triggering the
batch insert.
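To show the shape I was hoping for: something like the window function below, which (if I read the API right) would receive the whole window's contents when the window fires, so the sink would get one ready-made batch per window and the extra thread could go away. I haven't been able to confirm this is the intended pattern; `PostgresBatchSink` is again a placeholder.

```java
import java.util.ArrayList;
import java.util.List;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.windowing.AllWindowFunction;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
import org.apache.flink.util.Collector;

public class WindowedBatching {
    static void attach(DataStream<String> stream) {
        stream
            .timeWindowAll(Time.seconds(1))
            .apply(new AllWindowFunction<String, List<String>, TimeWindow>() {
                @Override
                public void apply(TimeWindow window, Iterable<String> values,
                                  Collector<List<String>> out) {
                    List<String> batch = new ArrayList<>();
                    for (String v : values) {
                        batch.add(v);
                    }
                    // Emitted when the window fires, i.e. exactly the
                    // "window closed" notification I'm looking for.
                    out.collect(batch);
                }
            })
            .addSink(new PostgresBatchSink()); // one batch INSERT per List
    }
}
```

Is this (or something like it) the idiomatic way to do it, or is there a better mechanism?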

Any other possible solution would be highly appreciated. :)
Thanks



--
View this message in context: 
http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/Kafka-Stream-to-Database-batch-inserts-tp10036.html
Sent from the Apache Flink User Mailing List archive at Nabble.com.
