This is something we have needed for a while too. We are restarting the
streaming context to handle new topic subscriptions and unsubscriptions, which
hurts the latency of update handling. I think this is something that needs to
be addressed in core Spark Streaming: I can't think of any fundamental
limitation that prevents it; perhaps nobody has expressed interest in the
feature so far?
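For what it's worth, the restart approach can be sketched roughly as below. This is only a sketch against the 2015-era direct-stream API (`KafkaUtils.createDirectStream` from spark-streaming-kafka); the helper names `updatedTopics` and `restartWithTopics`, the `kafkaParams` map, and the 10-second batch interval are all assumptions for illustration, not code from either of our applications:

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object TopicRestart {

  // Pure helper (hypothetical): compute the topic set after a change request.
  def updatedTopics(current: Set[String],
                    add: Set[String],
                    remove: Set[String]): Set[String] =
    (current ++ add) -- remove

  // Sketch of the restart approach described above: stop the old streaming
  // context gracefully (keeping the shared SparkContext alive), then build a
  // fresh one that subscribes to the updated topic set.
  def restartWithTopics(sc: SparkContext,
                        old: StreamingContext,
                        kafkaParams: Map[String, String],
                        topics: Set[String]): StreamingContext = {
    // Graceful stop drains in-flight batches but still costs latency,
    // which is the drawback discussed in this thread.
    old.stop(stopSparkContext = false, stopGracefully = true)

    val ssc = new StreamingContext(sc, Seconds(10)) // assumed batch interval
    val stream = KafkaUtils.createDirectStream[String, String,
      StringDecoder, StringDecoder](ssc, kafkaParams, topics)

    // Placeholder action so the new context has something to run.
    stream.foreachRDD(rdd => println(s"batch size: ${rdd.count()}"))

    ssc.start()
    ssc
  }
}
```

The graceful stop plus rebuild is exactly why this hurts latency: every subscription change pays a full drain-and-restart cycle, which is why handling it inside core Spark Streaming would be preferable.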

From: yael.aharo...@gmail.com At: Aug 27 2015 10:19:33
To: user@spark.apache.org
Subject: Re: Adding Kafka topics to a running streaming context

Hello,
My streaming application needs to consume new Kafka topics at arbitrary times.
I know I can stop and restart the streaming context when I need to introduce a
new stream, but that seems quite disruptive. I am wondering whether other
people have run into this situation and whether there is a more elegant
solution?
Thanks, Yael
