Hi Ewen,
Thank you for your advice.
I reached the following compromise: my application sends records in the Avro
binary encoding directly to Kafka using Confluent's serializers
(io.confluent/kafka-avro-serializer), which integrate with the Schema
Registry, and it also logs the records in the Avro JSON encoding.
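For reference, the producer side of that setup might be configured roughly as follows. This is a sketch: the broker address and Schema Registry URL are placeholders, and it assumes Confluent's kafka-avro-serializer is on the classpath.

```java
import java.util.Properties;

public class AvroProducerConfig {
    // Builds producer properties that use Confluent's KafkaAvroSerializer.
    // Broker and Schema Registry addresses below are placeholders.
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // The serializer registers/looks up schemas here automatically,
        // which is what prepares you for schema evolution later.
        props.put("schema.registry.url", "http://localhost:8081");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("value.serializer"));
    }
}
```

These properties would then be passed to a `KafkaProducer` whose values are Avro `GenericRecord`s (or generated specific records).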
The Avro JSON encoding is a wire-level format, whereas the AvroConverter
accepts Java runtime data (e.g. primitive types such as Strings and Integers,
Maps, Arrays, and Connect Structs).
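In Connect, the AvroConverter sits between that runtime data and the binary wire format. A typical worker configuration looks roughly like the following sketch (the registry URL is a placeholder):

```properties
# Connect worker settings (sketch): AvroConverter translates between
# Connect runtime data and Avro binary via the Schema Registry.
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
```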
The component that most closely matches your needs is Confluent's REST
proxy, which supports the Avro JSON encoding when receiving records over HTTP.
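For example, a record in the Avro JSON encoding can be posted to the REST proxy roughly like this. The topic name, host, and schema are illustrative; the v2 API embeds the value schema (or a previously registered schema id) alongside the records:

```
POST /topics/logs HTTP/1.1
Host: restproxy:8082
Content-Type: application/vnd.kafka.avro.v2+json

{
  "value_schema": "{\"type\": \"record\", \"name\": \"LogLine\", \"fields\": [{\"name\": \"msg\", \"type\": \"string\"}]}",
  "records": [
    {"value": {"msg": "hello"}}
  ]
}
```

The proxy registers the schema with the Schema Registry and writes the records to Kafka in the Avro binary encoding.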
Hi all,
My application will emit log files in the Avro JSON encoding so that humans
can easily read and grep the records.
I need to transfer these logs into Kafka in the Avro binary encoding,
and I want to use the Confluent Schema Registry in order to prepare for
schema evolution.
After some research, I think Kafka Connect could handle this.