The current Kafka stream implementation in Spark Streaming assumes the set of topics doesn't change during operation.
You could either take a crack at writing a subclass that does what you need; stop and restart the streaming context whenever the topic list changes; or, if your batch duration isn't too small, run the job as a series of RDDs (using the existing KafkaUtils.createRDD) where the set of topics is determined before each RDD.

On Thu, Aug 13, 2015 at 4:38 AM, Nisrina Luthfiyati <nisrina.luthfiy...@gmail.com> wrote:
> Hi all,
>
> I want to write a Spark Streaming program that listens to Kafka for a list
> of topics.
> The list of topics that I want to consume is stored in a DB and might
> change dynamically. I plan to periodically refresh this list of topics in
> the Spark Streaming app.
>
> My question is: is it possible to add/remove a Kafka topic that is consumed
> by a stream, or to create a new stream at runtime?
> Would I need to stop/start the program, or is there any other way to do
> this?
>
> Thanks!
> Nisrina
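The series-of-RDDs suggestion might look roughly like this sketch, using the Spark 1.x direct Kafka API (KafkaUtils.createRDD and OffsetRange are real API; loadTopicsFromDb, topicOffsetRanges, and the scheduling loop are hypothetical placeholders you'd fill in with your own DB lookup and offset bookkeeping):

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.kafka.{KafkaUtils, OffsetRange}

object PeriodicKafkaBatches {

  // Hypothetical: re-read the current topic list from your DB.
  def loadTopicsFromDb(): Seq[String] = ???

  // Hypothetical: for a topic, compute the offset range to consume per
  // partition since the last batch (e.g. tracked in the same DB).
  def topicOffsetRanges(topic: String): Seq[OffsetRange] = ???

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("periodic-kafka-batches"))
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")
    val batchIntervalMs = 60000L // stand-in for your batch duration

    while (true) {
      // Refresh the topic set before every "batch".
      val topics = loadTopicsFromDb()
      val offsetRanges: Array[OffsetRange] =
        topics.flatMap(topicOffsetRanges).toArray

      // One batch: a plain RDD over exactly the chosen offsets,
      // no DStream involved, so the topic set is free to change.
      val rdd = KafkaUtils.createRDD[String, String, StringDecoder, StringDecoder](
        sc, kafkaParams, offsetRanges)

      rdd.foreachPartition { records =>
        records.foreach { case (key, value) => /* process the message */ }
      }

      Thread.sleep(batchIntervalMs) // crude pacing; a real job would subtract elapsed time
    }
  }
}
```

Since each iteration builds its offset ranges from scratch, adding or removing a topic in the DB takes effect on the next batch without restarting anything; the trade-off versus a DStream is that you manage offsets and scheduling yourself.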