Hi Ben,

Have you checked the "branch" function of KStream to see if it fits your needs? It requires that all the generated streams share the same data types, but that seems to be the case for your scenario.
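For reference, a minimal sketch of what that could look like on 0.10, assuming the table name has already been put into the record key as a String; the topic names, application id, and broker address below are placeholders, and note that branch() needs its full list of predicates up front when the topology is built:

// Sketch only: assumes Kafka 0.10.0 and that an upstream step has already
// placed the "name" field into the record key; topic names, application id,
// and broker address are placeholders.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

import java.util.Properties;

public class DemuxByName {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "demux-example");     // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

        KStreamBuilder builder = new KStreamBuilder();

        // Source stream keyed by the "name" field, value is the raw JSON event.
        KStream<String, String> source = builder.stream("mystream");

        // branch() splits one stream into a fixed array of streams; each record
        // goes to the first predicate it matches, and all branches keep the
        // source stream's key/value types.
        KStream<String, String>[] branches = source.branch(
                (key, value) -> "foo".equals(key),
                (key, value) -> "bar".equals(key));

        branches[0].to("mystream-foo");
        branches[1].to("mystream-bar");

        new KafkaStreams(builder, props).start();
    }
}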
Guozhang

On Sat, Apr 16, 2016 at 7:31 AM, Benjamin Manns <benma...@gmail.com> wrote:
> Hi all,
>
> I'm taking a look at the new Kafka Streams system in 0.10, and I was
> wondering if anyone has an example or knows how I might demultiplex a
> topic with Kafka Streams. It looks pretty easy to join streams together
> (which is great), but I only see ways to produce a single (or a
> predetermined number of) topics. My use case is that I have a producer
> that generates messages like
>
> { name: 'foo', data: 1 }
> { name: 'bar', data: 2 }
> { name: 'foo', data: 4 }
>
> And I want to produce streams
>
> mystream-foo
>
> { name: 'foo', data: 1 }
> { name: 'foo', data: 4 }
>
> mystream-bar
>
> { name: 'bar', data: 2 }
>
> Where the cardinality of name is between 100 and 1000 and will change
> over time. (Specifically, I want to split change data capture from
> Maxwell's Daemon <http://maxwells-daemon.io/> or similar to a
> topic-per-table.)
>
> Is there a way to do this? My thinking is that it will be far more
> performant to consume per-table for stream/KTable joins than to filter
> from the firehose of every single change in the database.
>
> Thanks,
>
> Ben
>
> --
> Benjamin Manns
> benma...@gmail.com
> (434) 321-8324

--
-- Guozhang