Hi All,

We have a streaming computation that needs to validate the data stream against a model provided by the user.
Right now I load the model into the Flink operator and validate against it there. However, the model can be updated and changed frequently. Fortunately, we always publish a model-changed event to RabbitMQ, so I see a few options:

1. Create a RabbitMQ listener for the model-changed event inside the operator and update the model when an event arrives. However, I think this could create a race condition if not handled correctly, and it seems odd to keep a consumer inside the operator.

2. Move the model into an external in-memory cache and keep it up to date with Flink, so the operator retrieves the model from the cache.

3. Create two streams and use a co-operator to manage the shared state (a rough sketch of what I mean is below my signature).

What is your suggestion for keeping state up to date from an external event? Is there some kind of best practice for keeping a model up to date in a streaming operator?

Thanks a lot.

Cheers

--
Welly Tambunan
Triplelands

http://weltam.wordpress.com
http://www.triplelands.com <http://www.triplelands.com/blog/>
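P.S. A rough sketch of what I have in mind for option 3, connecting the data stream with a model-update stream and keeping the latest model inside a co-flatmap. Event and Model (with its isValid method) are placeholders for our real classes -- not tested, just to illustrate the idea:

import org.apache.flink.streaming.api.functions.co.RichCoFlatMapFunction;
import org.apache.flink.util.Collector;

// Keeps the latest model in the operator and validates events against it.
public class ValidateWithModel extends RichCoFlatMapFunction<Event, Model, Event> {

    // latest model seen so far; written by flatMap2, read by flatMap1
    private transient Model currentModel;

    @Override
    public void flatMap1(Event event, Collector<Event> out) {
        // validate the incoming event against the most recent model
        if (currentModel != null && currentModel.isValid(event)) {
            out.collect(event);
        }
    }

    @Override
    public void flatMap2(Model newModel, Collector<Event> out) {
        // a model-changed event arrived: swap in the new model
        currentModel = newModel;
    }
}

Wiring it up, with the model stream broadcast so every parallel instance sees each update:

DataStream<Event> events = ...;        // the data stream to validate
DataStream<Model> modelUpdates = ...;  // e.g. built from the RabbitMQ source

DataStream<Event> validated = events
    .connect(modelUpdates.broadcast())
    .flatMap(new ValidateWithModel());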