Hello everyone,

I am a Flink newcomer and I would like to implement a Flink application with 
two Kafka sources: one for the data stream to be processed and the other for 
control purposes. The application should read from the control stream and then 
apply the corresponding control operation to the data coming from the data 
stream. To be more concrete, I would like something like this: if the 
application reads a control operation with identifier 22 from the control 
source, then it should apply a certain transformation to all incoming data 
values that are marked with id 22.
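
For example, I was imagining something along the lines of the sketch below: 
keying both streams by the id, connecting them, and keeping the latest control 
operation for each id in keyed state. The class names (DataEvent, ControlEvent) 
and the apply() call are placeholders I made up, so this is only a rough idea 
of what I have in mind, not working code:

    import org.apache.flink.api.common.state.ValueState;
    import org.apache.flink.api.common.state.ValueStateDescriptor;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.functions.co.KeyedCoProcessFunction;
    import org.apache.flink.util.Collector;

    // dataStream and controlStream would come from the two Kafka sources,
    // already turned into DataStream<DataEvent> and DataStream<ControlEvent>
    DataStream<DataEvent> result = dataStream
        .keyBy(d -> d.id)                            // key the data by its id
        .connect(controlStream.keyBy(c -> c.id))     // key control by the same id
        .process(new KeyedCoProcessFunction<Integer, DataEvent, ControlEvent, DataEvent>() {

            // latest control operation seen for this id
            private transient ValueState<ControlEvent> activeOp;

            @Override
            public void open(Configuration parameters) {
                activeOp = getRuntimeContext().getState(
                    new ValueStateDescriptor<>("active-op", ControlEvent.class));
            }

            @Override
            public void processElement1(DataEvent data, Context ctx,
                                        Collector<DataEvent> out) throws Exception {
                ControlEvent op = activeOp.value();
                // apply() stands for whatever transformation the control op implies
                out.collect(op == null ? data : op.apply(data));
            }

            @Override
            public void processElement2(ControlEvent control, Context ctx,
                                        Collector<DataEvent> out) throws Exception {
                activeOp.update(control);   // remember the control operation for this id
            }
        });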

I would like to ask whether having two Kafka sources (one for the data and 
another for control purposes) is actually good practice. I would also like to 
ask whether you have any advice or suggestions on how to keep a queue of such 
active control operations.
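
For the queue of active operations I was thinking of something like ListState 
instead of the single ValueState above, but I am not sure this is the right 
approach. Again, just a sketch of what I mean, with placeholder names, meant 
to live inside the same KeyedCoProcessFunction:

    import org.apache.flink.api.common.state.ListState;
    import org.apache.flink.api.common.state.ListStateDescriptor;

    // all control operations that are still active for this id
    private transient ListState<ControlEvent> activeOps;

    @Override
    public void open(Configuration parameters) {
        activeOps = getRuntimeContext().getListState(
            new ListStateDescriptor<>("active-ops", ControlEvent.class));
    }

    @Override
    public void processElement2(ControlEvent control, Context ctx,
                                Collector<DataEvent> out) throws Exception {
        activeOps.add(control);   // enqueue the new control operation for this id
    }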

Thank you so much. 

Best,


Gabriele
