Hi Team Kafka,

(sorry for the flood, this is the last one! promise!)

If you tried out PR-99, you know that CopyCat now does ongoing
export/import: it continuously reads data from a source and writes it
to Kafka (or vice versa). This is great for tailing logs and replicating
from the MySQL binlog.
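
To make the continuous mode concrete, here's a rough sketch of the
shape of a streaming source task. To be clear, SourceTask and
SourceRecord below are stand-in types I made up for illustration, not
the actual PR-99 API:

    import java.util.Collections;
    import java.util.List;

    // Stand-in types for illustration only -- NOT the real CopyCat API.
    interface SourceTask {
        List<SourceRecord> poll() throws InterruptedException;
    }

    class SourceRecord {
        final String topic;
        final byte[] value;
        SourceRecord(String topic, byte[] value) {
            this.topic = topic;
            this.value = value;
        }
    }

    // Continuous mode: the framework calls poll() forever and the
    // task never "finishes". Here, tailing a log file.
    class LogTailTask implements SourceTask {
        private final java.io.BufferedReader log;

        LogTailTask(java.io.Reader in) {
            log = new java.io.BufferedReader(in);
        }

        public List<SourceRecord> poll() throws InterruptedException {
            try {
                String line = log.readLine();
                if (line == null) {                  // nothing new yet:
                    Thread.sleep(100);               // wait, then let the
                    return Collections.emptyList();  // framework poll again
                }
                return Collections.singletonList(
                    new SourceRecord("app-logs", line.getBytes()));
            } catch (java.io.IOException e) {
                throw new RuntimeException(e);
            }
        }
    }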

But I'm wondering if there's a need for a batch mode too.
This could be useful for:
* A Camus-like thing: you can stream data to HDFS, but the benefits are
limited and there are some known issues there.
* Dumping large parts of an RDBMS at once (sketched below).
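
For instance (pure speculation on my part, reusing the made-up
SourceTask/SourceRecord stand-ins from the sketch above), batch mode
could be as simple as letting a source task signal that it's done, so
the framework tears the task down instead of polling forever:

    import java.util.ArrayList;
    import java.util.Iterator;
    import java.util.List;

    // Speculative batch-mode contract: returning null from poll()
    // means "snapshot complete, shut this task down". Same made-up
    // stand-in types as above, not the real API.
    class TableSnapshotTask implements SourceTask {
        private final Iterator<byte[]> rows; // cursor over one table

        TableSnapshotTask(Iterator<byte[]> rows) {
            this.rows = rows;
        }

        public List<SourceRecord> poll() {
            if (!rows.hasNext())
                return null; // done: framework stops calling poll()
            List<SourceRecord> batch = new ArrayList<>();
            for (int i = 0; i < 1000 && rows.hasNext(); i++)
                // "mysql.users" is just an example topic name
                batch.add(new SourceRecord("mysql.users", rows.next()));
            return batch;
        }
    }

The nice part of a contract like that is that batch and streaming
connectors would share one interface, with only the termination
behavior differing. But there are plenty of other options (a separate
batch API, a config flag, a stop-after-offset setting), hence the
questions below.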

Do you agree that this need exists? Or is streaming export/import good enough?

Also, does anyone have ideas for how they'd like the batch mode to work?

Gwen
