Avro JSON encoding is a wire-level format, whereas the AvroConverter accepts
Java runtime data (e.g. primitive types like Strings and Integers, Maps,
Arrays, and Connect Structs), so it won't consume Avro JSON-encoded bytes
directly.
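
Roughly, that means the converter is handed a Connect Schema plus a Struct
(or Map, primitive, etc.) and returns Avro binary bytes, registering the
schema with the schema registry as a side effect. A minimal, untested sketch
(the registry URL, topic name, and schema below are just placeholders):

import java.util.Collections;

import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

import io.confluent.connect.avro.AvroConverter;

public class ConverterSketch {
  public static void main(String[] args) {
    AvroConverter converter = new AvroConverter();
    // Point the converter at a schema registry; false = value (not key) converter.
    converter.configure(
        Collections.singletonMap("schema.registry.url", "http://localhost:8081"),
        false);

    // This is the "Java runtime data" the converter expects as input.
    Schema schema = SchemaBuilder.struct().name("LogRecord")
        .field("message", Schema.STRING_SCHEMA)
        .build();
    Struct value = new Struct(schema).put("message", "hello");

    // Output is Avro binary (framed with the registry's magic byte and schema
    // id), not Avro JSON.
    byte[] serialized = converter.fromConnectData("logs", schema, value);
    System.out.println("serialized " + serialized.length + " bytes");
  }
}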

The component that most closely matches your needs is Confluent's REST
proxy, which supports the Avro JSON encoding when receiving a payload (via
the vnd.kafka.avro.v1+json content type). However, since the REST proxy is
push-based, you'd still need another process to read your logs and send the
requests to it.
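
For reference, a produce request to the REST proxy looks roughly like the
following. This is an untested sketch: the proxy address (localhost:8082),
topic name, and schema are assumptions on my part, so double-check them
against the REST proxy docs for your version.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class RestProxyProduce {
  public static void main(String[] args) throws Exception {
    // value_schema carries the Avro schema as an escaped JSON string; each
    // entry in records carries one Avro JSON-encoded value.
    String body =
        "{\"value_schema\": \"{\\\"type\\\": \\\"record\\\", \\\"name\\\": \\\"LogRecord\\\", "
      + "\\\"fields\\\": [{\\\"name\\\": \\\"message\\\", \\\"type\\\": \\\"string\\\"}]}\", "
      + "\"records\": [{\"value\": {\"message\": \"hello\"}}]}";

    HttpURLConnection conn =
        (HttpURLConnection) new URL("http://localhost:8082/topics/logs").openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Content-Type", "application/vnd.kafka.avro.v1+json");
    conn.setDoOutput(true);
    try (OutputStream out = conn.getOutputStream()) {
      out.write(body.getBytes(StandardCharsets.UTF_8));
    }
    System.out.println("HTTP " + conn.getResponseCode());
  }
}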

Alternatively, you could build a source connector that pulls Avro
JSON-encoded data and emits generic Connect data. By combining that with the
AvroConverter, you'd end up writing Avro binary-serialized data into Kafka
(albeit via a slightly indirect route).
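
The core of such a source task might look roughly like the sketch below. It's
untested, the config keys (schema.file, log.file, topic) are made up for the
example, and there's no real offset tracking; the interesting part is decoding
the Avro JSON encoding with Avro's JsonDecoder and then translating the result
into Connect data with the AvroData helper from the avro-converter package.
With the worker configured to use the AvroConverter, the framework then takes
care of producing Avro binary into Kafka.

import java.util.Collections;
import java.util.List;
import java.util.Map;

import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DecoderFactory;
import org.apache.kafka.connect.data.SchemaAndValue;
import org.apache.kafka.connect.errors.ConnectException;
import org.apache.kafka.connect.source.SourceRecord;
import org.apache.kafka.connect.source.SourceTask;

import io.confluent.connect.avro.AvroData;

public class AvroJsonLogSourceTask extends SourceTask {
  private org.apache.avro.Schema avroSchema;            // the writer's Avro schema
  private GenericDatumReader<GenericRecord> reader;
  private final AvroData avroData = new AvroData(100);  // Avro <-> Connect translator
  private java.io.BufferedReader input;
  private String topic;

  @Override
  public void start(Map<String, String> props) {
    topic = props.get("topic");
    try {
      avroSchema = new org.apache.avro.Schema.Parser()
          .parse(new java.io.File(props.get("schema.file")));
      reader = new GenericDatumReader<>(avroSchema);
      input = new java.io.BufferedReader(new java.io.FileReader(props.get("log.file")));
    } catch (java.io.IOException e) {
      throw new ConnectException(e);
    }
  }

  @Override
  public List<SourceRecord> poll() throws InterruptedException {
    try {
      String line = input.readLine();  // assumes one Avro JSON-encoded record per line
      if (line == null) {
        Thread.sleep(1000);            // nothing new yet; back off briefly
        return null;
      }
      // Decode the Avro JSON encoding into a GenericRecord...
      GenericRecord record =
          reader.read(null, DecoderFactory.get().jsonDecoder(avroSchema, line));
      // ...then translate it into Connect's schema + value representation.
      SchemaAndValue connectData = avroData.toConnectData(avroSchema, record);
      return Collections.singletonList(new SourceRecord(
          Collections.singletonMap("file", "log"),   // source partition (placeholder)
          Collections.singletonMap("position", 0L),  // source offset (placeholder)
          topic, connectData.schema(), connectData.value()));
    } catch (java.io.IOException e) {
      throw new ConnectException(e);
    }
  }

  @Override
  public void stop() { }

  @Override
  public String version() { return "0.1"; }
}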

-Ewen

On Mon, Oct 17, 2016 at 6:14 AM, Takahiro Hozumi <fat...@googlemail.com>
wrote:

> Hi all,
>
> My application will emit log files in Avro JSON encoding so that humans
> can easily read and grep the records.
> I need to transfer these logs into Kafka in Avro binary encoding.
> And I want to use the Confluent Schema Registry in order to prepare for
> schema evolution.
>
> After some research, I think Kafka Connect in standalone mode might be the
> easiest solution.
> The avro-converter included with Kafka Connect interacts with the schema
> registry under the hood.
> But I am not sure whether the avro-converter accepts Avro JSON encoding as
> input.
> Do I need to create a JSON-encoding version of the avro-converter?
>
> And I can't find documentation for the Converter interface.
> I thought the conversion job would just be byte[] to byte[], but the
> interface seems to require an internal data object.
>
> http://docs.confluent.io/3.0.1/connect/javadocs/index.html?org/apache/kafka/connect/storage/Converter.html
>
> Any advice or guidance would be greatly appreciated!
>
> Thanks,
> --
> Takahiro
>



-- 
Thanks,
Ewen
