Hi Thangaram,

Offsets are stored in the __consumer_offsets topic, so having a lot of offsets to store (around 100k in your case) will require reads and writes to this topic. Usually that is totally fine, especially when commits on the consumer side are not too frequent. Otherwise, in some situations it can be better to store consumer offsets outside Kafka.

From your description, I assume you want to serve the data over HTTP? Then you could, for example, keep the last consumed offset in a cookie, or as a query string parameter you expose on your consumer API.
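As a rough illustration of that idea, here is a minimal Python sketch where the client carries its own offset on each HTTP request. The in-memory LOG list and the fetch_one helper are hypothetical stand-ins for a real topic partition and consumer, not kafka-python API; with a real consumer you would assign() the partition and seek() to the caller-supplied offset before polling.

```python
# Sketch: offsets live with the client (cookie / query param), not in
# __consumer_offsets. LOG stands in for a single topic partition.
LOG = ["msg-0", "msg-1", "msg-2", "msg-3"]

def fetch_one(offset):
    """Return (record, next_offset) starting at the caller-supplied offset.

    With kafka-python this would roughly be:
        consumer.assign([TopicPartition(topic, 0)])
        consumer.seek(TopicPartition(topic, 0), offset)
    followed by polling a single record.
    """
    if offset >= len(LOG):
        return None, offset          # nothing new to consume yet
    return LOG[offset], offset + 1   # client stores next_offset for its next request

# Each request passes back the offset it last received, e.g. ?offset=0:
record, nxt = fetch_one(0)     # -> ("msg-0", 1)
record2, nxt2 = fetch_one(nxt) # -> ("msg-1", 2)
```

With this pattern the broker never tracks per-user consumer groups at all; the trade-off is that the client is now responsible for not losing its offset.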
Maybe you can provide additional details on your use case?

Best regards

On Mon, Apr 22, 2019 at 6:19 PM Thangaram Senthamaraikannan <thangaram...@gmail.com> wrote:

> Hi,
>
> I have some doubts regarding using the Kafka consumer API in Kafka
> version 0.10.0.1.
>
> Consider I have a three-node Kafka cluster with 1000 topics, each with
> a single partition. Each topic will be consumed by multiple consumer
> groups, say 100, in parallel. Therefore, in total there can be 1000*100
> consumer groups consuming from Kafka in parallel.
>
> My concern is whether this would have any performance impact on the
> Kafka cluster at larger scale?
>
> Also, my model will start a consumer for each user request, fetch a
> single record from the queue, and close the consumer when the request
> ends. For the next request from the same user, a consumer will be
> started for the same group again and then closed. This will be repeated
> for all users.
>
> Is creating and tearing down consumers in such a fashion a proper
> approach? If not, is there another proper way to handle this use case?
>
> Regards,
> Thangaram Senthamaraikannan