Hi all,

My application will emit log files in Avro JSON encoding so that humans can
easily read and grep the records.
I need to get these logs into Kafka in Avro binary encoding.
And I want to use the Confluent Schema Registry to prepare for schema
evolution.

After some research, I think Kafka Connect in standalone mode might be the
easiest solution.
The avro-converter used with Kafka Connect talks to the Schema Registry
under the hood.
But I am not sure whether the avro-converter accepts Avro JSON encoding as
input.
Do I need to create a JSON-encoding version of the avro-converter?
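If it does not, I imagine I would first have to decode the JSON encoding
myself with the plain Avro library, roughly like the sketch below (method
and variable names are just my guesses, not code I have running):

    import java.io.IOException;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.Decoder;
    import org.apache.avro.io.DecoderFactory;

    // Rough sketch: turn one Avro JSON-encoded log line back into a
    // GenericRecord, using the same writer schema my application logs with.
    static GenericRecord decodeJsonLine(String schemaJson, String jsonLine)
            throws IOException {
        Schema schema = new Schema.Parser().parse(schemaJson);
        GenericDatumReader<GenericRecord> reader =
                new GenericDatumReader<>(schema);
        Decoder decoder = DecoderFactory.get().jsonDecoder(schema, jsonLine);
        return reader.read(null, decoder);
    }

If that is the right direction, my question becomes how to hand such a
GenericRecord on to the avro-converter / Schema Registry path.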

Also, I cannot find documentation for the Converter interface.
I would have expected conversion to be a simple byte[]-to-byte[] job, but
the interface seems to require Connect's internal data objects.

http://docs.confluent.io/3.0.1/connect/javadocs/index.html?org/apache/kafka/connect/storage/Converter.html
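For what it is worth, the methods I see in that javadoc look roughly like
this (copied as I understand them, so please correct me if I misread):

    // org.apache.kafka.connect.storage.Converter
    void configure(Map<String, ?> configs, boolean isKey);

    // Connect data (schema + value) -> serialized bytes for Kafka
    byte[] fromConnectData(String topic, Schema schema, Object value);

    // serialized bytes from Kafka -> Connect data (schema + value)
    SchemaAndValue toConnectData(String topic, byte[] value);

So it seems a Converter translates between Kafka bytes and Connect's
internal data objects rather than between two byte[] forms, which is the
part that confuses me.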

Any advice or guidance would be greatly appreciated!

Thanks,
--
Takahiro
