Hi Toni,

Couple of thoughts.

1. Kafka's behaviour need not change at runtime. Your producers, which
push your MAC data into Kafka, should know which topic each record should
be written to. A producer can be Flume, Logstash, or your own
custom-written Java producer.

As long as your producers know which topic to write to, they can keep
creating new topics as new MAC data comes through your pipeline.
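To make this concrete, here is a minimal sketch of that routing idea in plain Java. The `MacTopicRouter` class, its method names, and the `mac.` topic-naming fallback are all assumptions for illustration, not anything Kafka provides; in a real producer you would pass the resolved topic into a `ProducerRecord` and send it with `KafkaProducer` from the `kafka-clients` library. Note that writing to a previously unseen topic only auto-creates it if the broker has `auto.create.topics.enable=true`.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical router: resolves a device MAC to the Kafka topic it
// should publish to, using an updatable in-memory dictionary.
public class MacTopicRouter {
    private final Map<String, String> macToTopic = new HashMap<>();
    private final String defaultPrefix;

    public MacTopicRouter(String defaultPrefix) {
        this.defaultPrefix = defaultPrefix;
    }

    // The dictionary can be updated at runtime as new MACs appear;
    // no Kafka-side change is needed.
    public void register(String mac, String topic) {
        macToTopic.put(mac, topic);
    }

    // Unknown MACs fall back to a derived topic name (an assumed
    // convention: prefix + MAC with ':' replaced, since ':' is not
    // a legal character in Kafka topic names).
    public String resolveTopic(String mac) {
        return macToTopic.getOrDefault(mac, defaultPrefix + mac.replace(":", "-"));
    }

    public static void main(String[] args) {
        MacTopicRouter router = new MacTopicRouter("mac.");
        router.register("aa:bb:cc:dd:ee:ff", "sensors.lab");
        System.out.println(router.resolveTopic("aa:bb:cc:dd:ee:ff"));
        System.out.println(router.resolveTopic("00:11:22:33:44:55"));
    }
}
```

The producer itself then stays trivial: look up the topic per record and send, e.g. `producer.send(new ProducerRecord<>(router.resolveTopic(mac), mac, payload))`.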

On Wed, Jan 28, 2015 at 12:10 PM, Toni Cebrián <toni.cebr...@gmail.com>
wrote:

> Hi,
>
>     I'm starting to weight different alternatives for data ingestion and
> I'd like to know whether Kafka meets the problem I have.
>     Say we have a set of devices each with its own MAC and then we receive
> data in Kafka. There is a dictionary defined elsewhere that says each MAC
> to which topic must publish. So I have basically 2 questions:
> New MACs keep comming and the dictionary must be updated accordingly. How
> could I change this Kafka behaviour during runtime?
> A problem for the future. Say that dictionaries are so big that they don't
> fit in memory. Are there any patterns for bookkeeping internal data
> structures and how route to them?
>
> T.
>
