RE: Kafka Connect key.converter and value.converter properties for Avro encoding

2016-11-08 Thread david.franklin
…in compacted topics for deletion, it is important to actually translate null values in Connect to be true nulls in Kafka.
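The point above can be sketched in code. The following is a minimal illustration, not the thread's actual connector: a source record whose value (and value schema) is null becomes a Kafka tombstone, which log compaction uses to delete earlier records with the same key. The topic name, key, and offset map are hypothetical, and the snippet needs `connect-api` on the classpath.

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceRecord;
import java.util.Collections;
import java.util.Map;

public class TombstoneSketch {
    public static void main(String[] args) {
        Map<String, ?> partition = Collections.singletonMap("source", "example"); // hypothetical
        Map<String, ?> offset = Collections.singletonMap("position", 42L);        // hypothetical
        // A true null value (with a null value schema) is what a compacted
        // topic interprets as "delete this key".
        SourceRecord tombstone = new SourceRecord(
                partition, offset,
                "my-topic",                        // hypothetical topic
                Schema.STRING_SCHEMA, "user-123",  // key schema + key
                null, null);                       // null value schema + null value
        System.out.println(tombstone);
    }
}
```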

Re: Kafka Connect key.converter and value.converter properties for Avro encoding

2016-11-07 Thread Ewen Cheslack-Postava

RE: Kafka Connect key.converter and value.converter properties for Avro encoding

2016-11-07 Thread david.franklin
…You won't be accepting/returning SpecificRecords directly when working with Connect's API. Connect intentionally uses an interface that is different from…
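To make the distinction concrete: Connect source tasks build records from Connect's own runtime-agnostic data types (`Schema`, `Struct`), and the configured converter later maps those to the wire format. A minimal sketch, with hypothetical field and topic names, requiring `connect-api` on the classpath:

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.source.SourceRecord;
import java.util.Collections;

public class ConnectDataSketch {
    public static void main(String[] args) {
        // Connect's own schema type, not an Avro SpecificRecord class:
        Schema valueSchema = SchemaBuilder.struct().name("Event") // hypothetical name
                .field("id", Schema.INT64_SCHEMA)
                .field("payload", Schema.STRING_SCHEMA)
                .build();
        Struct value = new Struct(valueSchema)
                .put("id", 1L)
                .put("payload", "hello");
        // The worker's configured Converter (e.g. the Avro converter)
        // serializes this generic representation on the way to Kafka.
        SourceRecord record = new SourceRecord(
                Collections.singletonMap("source", "example"),  // hypothetical
                Collections.singletonMap("position", 0L),       // hypothetical
                "my-topic", valueSchema, value);
        System.out.println(record);
    }
}
```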

Re: Kafka Connect key.converter and value.converter properties for Avro encoding

2016-11-06 Thread Ewen Cheslack-Postava
…be achieved via a corresponding SpecificDatumReader. Does this look a reasonable approach?…

RE: Kafka Connect key.converter and value.converter properties for Avro encoding

2016-11-03 Thread david.franklin
…Both the Confluent Avro Converter and the Confluent Avro Serializer use the Schema Registry. The reason is, as Tommy Becker mentioned below, to avoid storing the entire schema in each record (which the JSON serializer in Apache Kafka does)…

Re: Kafka Connect key.converter and value.converter properties for Avro encoding

2016-11-02 Thread Gwen Shapira
Both the Confluent Avro Converter and the Confluent Avro Serializer use the Schema Registry. The reason is, as Tommy Becker mentioned below, to avoid storing the entire schema in each record (which the JSON serializer in Apache Kafka does). It has a few other benefits, such as schema validation…
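A minimal sketch of the Schema Registry-backed configuration being described, assuming Confluent's Avro converter package is installed; the registry URL is a placeholder:

```properties
# Use Avro with the Schema Registry for both keys and values
# (converter class and property names per Confluent's packaging;
# http://localhost:8081 is a placeholder URL)
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
```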

Re: Kafka Connect key.converter and value.converter properties for Avro encoding

2016-11-02 Thread Tommy Becker
Although I can't speak to details of the Confluent packaging, anytime you're using Avro you need the schemas for the records you're working with. In an Avro data file the schema is included in the file itself. But when you're encoding individual records, as in Kafka, most people instead encode…
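The usual alternative to embedding the schema is to embed a small schema *reference*. Confluent's serializers, for example, prefix each record with a magic byte and a 4-byte big-endian schema id, followed by the Avro binary body. A self-contained sketch of that framing (the id `21` and 3-byte body are made up for illustration):

```java
import java.nio.ByteBuffer;

public class WireFormat {
    // Framing: magic byte 0x00, 4-byte big-endian schema id, Avro body.
    static byte[] frame(int schemaId, byte[] avroBody) {
        ByteBuffer buf = ByteBuffer.allocate(1 + 4 + avroBody.length);
        buf.put((byte) 0x00);   // magic byte
        buf.putInt(schemaId);   // id used to look the schema up in the registry
        buf.put(avroBody);      // Avro binary-encoded record
        return buf.array();
    }

    // Recover the schema id from a framed message.
    static int schemaIdOf(byte[] framed) {
        ByteBuffer buf = ByteBuffer.wrap(framed);
        if (buf.get() != 0x00) {
            throw new IllegalArgumentException("unknown magic byte");
        }
        return buf.getInt();
    }

    public static void main(String[] args) {
        byte[] msg = frame(21, new byte[] {1, 2, 3});
        System.out.println(msg.length);      // 8 (1 magic + 4 id + 3 body)
        System.out.println(schemaIdOf(msg)); // 21
    }
}
```

Consumers read the id, fetch (and cache) the schema from the registry, and then decode the body, so each record carries only 5 bytes of overhead instead of the full schema text.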

Kafka Connect key.converter and value.converter properties for Avro encoding

2016-11-02 Thread david.franklin
I am using Kafka Connect in source mode, i.e. using it to send events to Kafka topics. With the key.converter and value.converter properties set to org.apache.kafka.connect.storage.StringConverter I can attach a consumer to the topics and see the events in a readable form. This is helpful and…
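For reference, the worker configuration described above looks like this:

```properties
# String converters for both keys and values, as in the message above
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
```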