Hi Abdelali,
If you can’t get your producers to send the different types of events to 
different topics (or you don’t want to), you could use Kafka Streams to filter 
the data in the topic into new topics that are subsets of the data. 
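As a rough sketch of what that could look like (the topic names, the JSON 
"type" field, and the broker address below are just placeholders for whatever 
your setup actually uses):

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class EventTypeSplitter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-type-splitter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // Read everything from the shared topic ("all-events" is a placeholder name).
        KStream<String, String> all = builder.stream("all-events");

        // Route each event to a per-type topic. The predicates here just look for a
        // "type" field in the JSON value - filter on whatever actually identifies
        // your event types (a header, a key prefix, a deserialized field, etc.).
        all.filter((key, value) -> value != null && value.contains("\"type\":\"order\""))
           .to("order-events");
        all.filter((key, value) -> value != null && value.contains("\"type\":\"payment\""))
           .to("payment-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Each consuming microservice can then subscribe only to the topic(s) it actually 
cares about, so it never has to deserialize events it would just throw away.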

I have also seen Apache Spark used to do something similar.
Thanks,
Jamie 




On Monday, June 13, 2022, 4:53 pm, abdelali elmantagui 
<abdelalielmanta...@gmail.com> wrote:

Hi All,

I started learning Kafka a couple of weeks ago, and my goal is to optimize an 
existing architecture that uses Kafka in its components.
The problem is that there are many microservices that produce messages/events to 
the Kafka topic, and on the other hand there are other microservices that 
consume these messages/events. Each consuming microservice has to consume all the 
messages and then filter out the ones it is interested in, and that creates a 
problem of huge memory usage because of the huge number of objects created in 
memory after deserialization of these messages.

I am asking for any concept or solution that can help in this situation.

Kind Regards,
Abdelali

+----------------+     +--------------+     +----------------+
| microservices  |---->| Kafka topic  |---->| microservices  |
+----------------+     +--------------+     +----------------+



