Hi,

We have an application that produces various data (dataset1, dataset2, ...) at various sites each time it is executed. Currently dataset1, dataset2, ... are stored in separate *.json data files for each execution.
We don't want to transfer the data while the application is running, because the application might not be able to connect to the database for various reasons. Instead, we want the producer of the data to notify a consumer (ideally a single central one) that it has data and where that data is located; the consumer then fetches it and pushes the data for that one execution to the db. Or is there a better way?

Our application can run at various sites:

Site 1  Producer1 -------> Consumer1 ---> db
Site 2  Producer1 -------> Consumer1 ---> db
Site 2  Producer2 -------> Consumer1 ---> db
Site 3  Producer1 -------> Consumer1 ---> db
...
Site x  Producerx -------> Consumer1 ---> db

Has anyone out there used Kafka in this way? Is it recommended, or are there alternatives?
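To make the idea concrete, here is a rough sketch of the notification side using the plain Kafka Java producer. The broker address, the "data-ready" topic name and the JSON field names are placeholders I made up; the point is that the message is only a small pointer to the data, never the data itself:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class DataReadyNotifier {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-central:9092"); // placeholder broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // The notification carries only metadata: which site, which execution,
        // and where the *.json file lives. All values below are placeholders.
        String notification = "{"
                + "\"site\":\"site1\","
                + "\"producer\":\"producer1\","
                + "\"executionId\":\"exec-0001\","
                + "\"path\":\"/data/exports/exec-0001/dataset1.json\""
                + "}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by site keeps one site's notifications ordered within a partition.
            producer.send(new ProducerRecord<>("data-ready", "site1", notification));
            producer.flush();
        }
    }
}

The central consumer would then subscribe to data-ready, fetch the file each message points to (shared mount, SFTP, whatever fits), and push that execution's data to the db. Since the notification is tiny, it is cheap to buffer and retry if the broker is temporarily unreachable.

Br //mike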