Hi Ricardo,

IIUC, your response assumes I'm talking about user events (ad clicks and
the like) that get published on a topic by some external process. That
wasn't what I meant, though. What I want to track is actual Kafka broker
events, such as a consumer group subscribing to a topic.
Basically, we have an internally-exposed Kafka cluster with a bunch of
topics, and I'm interested in who is connecting to it and consuming from
which topics. We control the authorisation (having injected our own class
there), so I could do something appropriate at that level, but I was
wondering if there is any built-in or OSS support, as I'm a bit hesitant to
put complexity (and potential for failure) into the authoriser.
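
For concreteness, this is roughly the shape of thing I'd consider bolting
onto the authoriser if nothing better exists - just a minimal sketch,
assuming the pre-2.4 Scala Authorizer API that ships with Kafka 2.2, and
with CustomAuthorizer standing in (hypothetical name) for the class we've
already injected:

import kafka.network.RequestChannel.Session
import kafka.security.auth.{Operation, Resource}
import org.slf4j.LoggerFactory
import scala.util.control.NonFatal

class AuditingAuthorizer extends CustomAuthorizer {
  // Dedicated logger so the audit stream can be routed/retained separately.
  private val audit = LoggerFactory.getLogger("kafka.audit")

  override def authorize(session: Session, operation: Operation, resource: Resource): Boolean = {
    val allowed = super.authorize(session, operation, resource)
    try {
      // One line per access decision; a log shipper can then answer
      // "who accessed this topic over the last 7 days".
      audit.info(s"principal=${session.principal} client=${session.clientAddress} " +
        s"operation=$operation resourceType=${resource.resourceType} " +
        s"resource=${resource.name} allowed=$allowed")
    } catch {
      case NonFatal(_) => // never let the audit path affect the authorisation result
    }
    allowed
  }
}

Even kept that thin, it's still extra code on the broker's request path,
which is exactly the complexity I'd rather avoid if there's an
off-the-shelf alternative.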

Burrow, for example, looks like something we could use to track which
consumer groups exist, but it doesn't help us if consumers don't commit
offsets (which they often don't, in our case), because then we don't know
which topics they are consuming.
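
For what it's worth, the fallback I had in mind there is a periodic
snapshot via the AdminClient APIs (available since Kafka 2.0), which report
the members and partition assignments of any currently-connected group even
when it never commits - though that still misses group-less consumers and
anything that connects and disconnects between polls. A rough sketch, with
the broker address as a placeholder:

import java.util.Properties
import org.apache.kafka.clients.admin.{AdminClient, AdminClientConfig}
import scala.collection.JavaConverters._

object GroupSnapshot extends App {
  val props = new Properties()
  props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092") // placeholder

  val admin = AdminClient.create(props)
  try {
    // All consumer groups currently known to the cluster...
    val groupIds = admin.listConsumerGroups().all().get().asScala.map(_.groupId()).toList
    // ...and, for each, the live members with their partition assignments.
    val descriptions = admin.describeConsumerGroups(groupIds.asJava).all().get().asScala

    for ((groupId, desc) <- descriptions; member <- desc.members().asScala) {
      val topics = member.assignment().topicPartitions().asScala.map(_.topic()).toSet
      println(s"group=$groupId clientId=${member.clientId()} host=${member.host()} " +
        s"topics=${topics.mkString(",")}")
    }
  } finally {
    admin.close()
  }
}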

Even so - latency isn't an issue. The question is more along the lines of
"who accessed this topic over the last 7 days".

Cheers,
-Joris.

On Tue, Jun 23, 2020 at 2:50 PM Ricardo Ferreira <rifer...@riferrei.com>
wrote:

> Joris,
>
> I think the best strategy here depends on how fast you want to get access
> to the user events. If latency is a concern then just read the data from
> the topic along with the other applications. Kafka follows the
> *write-once-read-many-times* pattern, which encourages developers to reuse
> the data in the topics for different purposes without necessarily creating
> a performance penalty within the applications. Just make sure to put this
> observer into its own consumer group and you will have a copy of the same
> data that is delivered to the other applications. If you don't want to
> write your own application just to read the events then you can use Kafka
> Connect and a specialized sink connector to read the data for you and
> write it wherever you want.
>
> If latency is not an issue then try MirrorMaker: it can replicate Kafka
> topics to another Kafka cluster that could be used for auditing purposes.
> Not necessary, I must say (I would go for the solution above), but
> certainly possible.
>
> Thanks,
>
> -- Ricardo
> On 6/23/20 7:41 AM, Joris Peeters wrote:
>
> Hello,
>
> For auditing and tracking purposes, I'd like to be able to monitor
> consumer events such as topic subscriptions etc. The idea is to have the
> individual events, not a per-second aggregate count.
>
> We are using the Confluent Docker Kafka image, version 5.2.2 (with some
> bespoke auth injected), which I believe corresponds to Kafka 2.2.
>
> What are some possible strategies that people have used for this?
>
> Thanks,
> -Joris.
>