Hi,

I have a question about the best way to implement something in Kafka Streams. What I would like to do is dynamically update the subscription pattern of the source topics.

The reasoning behind this (in my project): metadata about the source topics is published to another Kafka topic, which the Kafka Streams topology should track, and depending on that metadata specific source topics should be added to or removed from the topology. Currently I track this "metadata topic" as global state, so that every processor can access it to fetch the metadata (which, among other things, describes whether a specific topic pattern should be tracked by the stream processor) - so consider it a kind of "configuration" stream about the source topics. A minimal sketch of this setup is below.

So here is my question: is there any way to update the Kafka consumer's subscription from a running topology, so that I can replace the source topic pattern while the topology is running? I don't think there currently is a way to do this, but since under the hood it is just a Kafka consumer, my belief is that it should be possible somehow. I was thinking about the PartitionAssignor - if I could get my hands on that one, maybe I could configure it dynamically to only allow specific topic patterns? Or directly alter the subscription on the underlying consumer? I don't know all the nifty details of the Kafka Streams internals, so it would be nice if someone could point me in the right direction.

Thanks,
Bart
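P.S. For concreteness, here is a minimal sketch of the kind of setup I mean (topic names, serdes and the store name are just illustrative, the real metadata handling is more involved):

import java.util.Properties;
import java.util.regex.Pattern;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.GlobalKTable;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;

public class DynamicSourcesSketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // The "configuration" stream: topic metadata kept as global state,
        // so every processor instance can look it up by name.
        GlobalKTable<String, String> topicMetadata = builder.globalTable(
                "topic-metadata",
                Consumed.with(Serdes.String(), Serdes.String()),
                Materialized.as("topic-metadata-store"));

        // Source topics are subscribed via a pattern. This pattern is fixed
        // when the topology is built - it is the part I would like to change
        // while the application is running.
        KStream<String, String> sources = builder.stream(
                Pattern.compile("source-.*"),
                Consumed.with(Serdes.String(), Serdes.String()));

        // Downstream processors consult the global store (e.g. via
        // context().getStateStore("topic-metadata-store")) to decide whether
        // a record from a given source topic should be handled at all.
        sources.foreach((key, value) -> { /* ... */ });

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "dynamic-sources-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}

The Pattern passed to builder.stream() is what I would like to be able to swap out (or narrow/widen) at runtime, based on what arrives on the metadata topic.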