Hi all, I'm taking a look at the new Kafka Streams API in 0.10, and I was wondering if anyone has an example of, or knows how I might, demultiplex a topic with Kafka Streams. It looks pretty easy to join streams together (which is great), but I only see ways to produce to a single topic (or to a predetermined number of topics). My use case is that I have a producer that generates messages like
{ name: 'foo', data: 1 }
{ name: 'bar', data: 2 }
{ name: 'foo', data: 4 }

and I want to produce the streams

mystream-foo
  { name: 'foo', data: 1 }
  { name: 'foo', data: 4 }

mystream-bar
  { name: 'bar', data: 2 }

where the cardinality of name is between 100 and 1000 and will change over time. (Specifically, I want to split change data capture from Maxwell's Daemon <http://maxwells-daemon.io/> or similar into a topic-per-table layout.) Is there a way to do this? My thinking is that it will be far more performant to consume per-table for stream/KTable joins than to filter the firehose of every single change in the database.

Thanks,
Ben

--
Benjamin Manns
benma...@gmail.com
(434) 321-8324
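To make the routing rule concrete, here is a minimal sketch of what I mean (the class and method names are just made up for illustration, and this only shows the record-to-topic mapping, not any actual Kafka plumbing): each record should be routed to a topic derived from its name field.

```java
import java.util.List;

public class DemuxSketch {
    // Hypothetical routing rule: derive the destination topic
    // from a record's name field, e.g. "foo" -> "mystream-foo".
    static String topicFor(String name) {
        return "mystream-" + name;
    }

    public static void main(String[] args) {
        // A few records like the ones above, as (name, data) pairs.
        List<String[]> records = List.of(
            new String[] {"foo", "1"},
            new String[] {"bar", "2"},
            new String[] {"foo", "4"});
        for (String[] r : records) {
            // In the real pipeline this would be a producer.send()
            // to the derived topic; here we just print the routing.
            System.out.println(topicFor(r[0]) + " <- data=" + r[1]);
        }
    }
}
```

Since the set of names is not known up front, the topic name has to be computed per record like this, rather than declared once when the topology is built.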