The current converters expect Avro records with a schema id prepended to
the serialized Avro, and they need the Schema Registry running. I'm
guessing this is what Olivier is talking about.
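For reference, the framing the AvroConverter expects is the Confluent wire
format: a magic byte, a 4-byte schema id, then the Avro payload. A minimal
sketch of reading that header (the class name is mine, just for
illustration):

    import java.nio.ByteBuffer;

    // Sketch: the Confluent wire format the AvroConverter expects.
    //   byte 0     : magic byte (0x0)
    //   bytes 1-4  : schema id, 4-byte big-endian int (Schema Registry)
    //   bytes 5... : the serialized Avro record itself
    public class WireFormat {
        public static int readSchemaId(byte[] record) {
            ByteBuffer buf = ByteBuffer.wrap(record);
            byte magic = buf.get();
            if (magic != 0x0) {
                throw new IllegalArgumentException("Unknown magic byte: " + magic);
            }
            return buf.getInt(); // the Avro payload follows this header
        }
    }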

I think it is possible to write your own converter that doesn't need this,
but I haven't tried it.
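If you do, a pass-through converter is pretty small. A rough, untested
sketch against the Connect Converter interface (the class name is mine);
it just hands raw bytes through in both directions, so no schema id header
and no registry:

    import java.util.Map;

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaAndValue;
    import org.apache.kafka.connect.storage.Converter;

    // Pass-through converter: treats every message as opaque bytes, so it
    // needs no schema id header and no Schema Registry.
    public class RawBytesConverter implements Converter {
        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
            // nothing to configure
        }

        @Override
        public byte[] fromConnectData(String topic, Schema schema, Object value) {
            return (byte[]) value; // assumes the connector supplies byte[]
        }

        @Override
        public SchemaAndValue toConnectData(String topic, byte[] value) {
            return new SchemaAndValue(Schema.OPTIONAL_BYTES_SCHEMA, value);
        }
    }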

-Dave

-----Original Message-----
From: Michael Sklyar [mailto:mikesk...@gmail.com]
Sent: Monday, September 26, 2016 6:11 AM
To: users@kafka.apache.org
Subject: Re: Kafka connect 2.0.1 - ByteArrayConverter ?

Not sure what you are trying to do. Insert data into Kafka? Get data out of
Kafka?

What about the JsonConverter?
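If JSON suits your data, switching is just a worker-config change.
Something like the following in the Connect worker properties (set
schemas.enable to false if you don't want the embedded schema envelope):

    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    # false = plain JSON, no embedded schema envelope
    key.converter.schemas.enable=false
    value.converter.schemas.enable=false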


On Fri, Sep 23, 2016 at 4:13 PM, Olivier Girardot
<o.girar...@lateral-thoughts.com> wrote:

> Hi everyone, is there any way to use a more straightforward converter
> instead of the AvroConverter for Avro data? The NullPointerException
> (https://github.com/confluentinc/kafka-connect-hdfs/issues/36 and
> https://github.com/confluentinc/schema-registry/issues/272) is quite
> blocking, and an upgrade of the cluster is not yet possible.
> I guess such a converter could be developed, but I was wondering if any
> workaround is available out of the box?
> Regards,
> Olivier Girardot
