Joris,

I think the best strategy here depends on how fast you need access to the user events. If latency is a concern, just read the data from the topic along with the other applications. Kafka follows the /write-once-read-many-times/ pattern, which encourages developers to reuse the data in the topics for different purposes without necessarily creating a performance penalty for the applications. Just make sure to put this observer into its own consumer group, and it will receive its own copy of the same data that is delivered to the other applications. If you don't want to write your own application just to read the events, you can use Kafka Connect with a specialized sink connector to read the data for you and write it wherever you want.
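To make the first option concrete, here is a minimal sketch of such an observer using the plain Java consumer. The topic name ("user-events"), broker address, and String deserializers are assumptions; adjust them to whatever your producers actually write:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class AuditObserver {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // hypothetical broker address
            props.put("group.id", "audit-observer");          // dedicated group: gets its own copy of the data
            props.put("auto.offset.reset", "earliest");       // start from the oldest retained events
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("user-events")); // hypothetical topic name
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // Hand each individual event to your audit store here
                        System.out.printf("offset=%d key=%s value=%s%n",
                                record.offset(), record.key(), record.value());
                    }
                }
            }
        }
    }

Because the group.id is unique to this observer, the brokers track its offsets independently, so it never steals partitions from the real applications.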

If latency is not an issue, then try MirrorMaker: it can replicate Kafka topics to another Kafka cluster that could be used for auditing purposes. Not necessary, I must say (I would go for the solution above), but certainly possible.
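For reference, a minimal sketch of what a MirrorMaker 2 configuration (connect-mirror-maker.properties) could look like. Note that MirrorMaker 2 ships with Kafka 2.4+, so against your 2.2 brokers you would run it from a newer distribution (it only needs client access), or fall back to the classic MirrorMaker, which is driven by separate consumer and producer property files instead. The cluster aliases, addresses, and topic pattern below are assumptions:

    # Aliases for the two clusters (hypothetical names)
    clusters = prod, audit

    # Bootstrap servers for each cluster (hypothetical addresses)
    prod.bootstrap.servers = prod-kafka:9092
    audit.bootstrap.servers = audit-kafka:9092

    # Replicate from prod into the audit cluster
    prod->audit.enabled = true

    # Regex of topics to mirror (hypothetical pattern)
    prod->audit.topics = user-events.*

You would then start it with bin/connect-mirror-maker.sh connect-mirror-maker.properties.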

Thanks,

-- Ricardo

On 6/23/20 7:41 AM, Joris Peeters wrote:
Hello,

For auditing and tracking purposes, I'd like to be able to monitor user
consumer events like topic subscriptions etc. The idea is to have the
individual events, not some per-second aggregate count.

We are using the confluent-docker kafka image, for 5.2.2 (with some bespoke
auth injected), which I believe is Kafka 2.2.

What are some possible strategies that people have used for this?

Thanks,
-Joris.
