Sounds like you're describing a source function that subscribes to CouchDB updates. You'd usually implement this with a Co(Flat)MapFunction that has two inputs, one from Kafka and one from CouchDB, and stores the processing parameters in state.
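Roughly like this (just a sketch; KafkaEvent, Parameters and EnrichedEvent stand in for your own classes, and for fault tolerance you'd put the parameters into Flink state, e.g. broadcast state, rather than a plain field):

import org.apache.flink.streaming.api.functions.co.RichCoFlatMapFunction;
import org.apache.flink.util.Collector;

// KafkaEvent, Parameters and EnrichedEvent are placeholders for your own classes.
public class ParameterizedEnricher
        extends RichCoFlatMapFunction<KafkaEvent, Parameters, EnrichedEvent> {

    // latest processing parameters seen from the CouchDB side;
    // for fault tolerance this would live in Flink state instead of a plain field
    private Parameters current;

    @Override
    public void flatMap1(KafkaEvent event, Collector<EnrichedEvent> out) {
        if (current != null) {
            out.collect(new EnrichedEvent(event, current));
        }
        // else: decide whether to drop, buffer, or fall back to default parameters
    }

    @Override
    public void flatMap2(Parameters params, Collector<EnrichedEvent> out) {
        // called whenever the CouchDB source emits an updated document
        current = params;
    }
}

and then connect the two streams:

kafkaStream
    .connect(couchDbParameterStream)
    .flatMap(new ParameterizedEnricher());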

There's no built-in way to subscribe to CouchDB updates, so you'll have to find a client and do that part yourself.
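CouchDB's _changes feed is one way to get notified of new documents. A very rough polling source could look like this (the database URL is made up, I'm using Java 11's HttpClient but any HTTP or CouchDB client would do, and error handling, checkpointing of the sequence number, and real JSON parsing are left out):

import org.apache.flink.streaming.api.functions.source.SourceFunction;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Polls CouchDB's _changes feed and emits the raw JSON of each batch of changes.
public class CouchDbChangesSource implements SourceFunction<String> {

    private final String dbUrl; // e.g. "http://localhost:5984/processing_params" (made-up URL)
    private volatile boolean running = true;

    public CouchDbChangesSource(String dbUrl) {
        this.dbUrl = dbUrl;
    }

    @Override
    public void run(SourceContext<String> ctx) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        String since = "0";
        while (running) {
            // long-poll: the request returns as soon as a new document is written
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(dbUrl + "/_changes?feed=longpoll&include_docs=true&since=" + since))
                    .GET()
                    .build();
            String body = client.send(request, HttpResponse.BodyHandlers.ofString()).body();
            ctx.collect(body);

            // crude extraction of "last_seq" so the next poll resumes where this one stopped
            Matcher m = Pattern.compile("\"last_seq\"\\s*:\\s*\"?([^\",}]+)").matcher(body);
            if (m.find()) {
                since = m.group(1);
            }
        }
    }

    @Override
    public void cancel() {
        running = false;
    }
}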

On 12/04/2019 12:41, Soheil Pourbafrani wrote:
Hi,

In my problem I need to process Kafka messages using Apache Flink, while some processing parameters should be read from CouchDB, so I have two questions:

1- What is the Flink way to read data from CouchDB?
2- I want to trigger Flink to load data from CouchDB when a new document is inserted into the database, instead of reading from the database on every message. What should my strategy be in such an environment? I think I need some trigger or event handler? Does Flink provide any facilities for this?

Thanks

