Although I can't speak to the details of the Confluent packaging, any time you're using
Avro you need the schemas for the records you're working with. In an Avro data
file the schema is included in the file itself. But when you're encoding
individual records, as in Kafka, most people instead embed some sort of
identifier/version number/fingerprint in each message that uniquely identifies the
schema in an external system (i.e. a schema registry). So I'm not sure
how you would use Avro in Kafka without some sort of schema registry, unless
you're planning on either using a static topic -> schema mapping or encoding
the full schema in every message.
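For what it's worth, the Confluent serializer takes the identifier approach: each
message is framed as a magic byte, a 4-byte schema id, and then the Avro binary
payload. A rough sketch of that framing (the schema id would be whatever the
registry assigned to the writer schema):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryEncoder;
    import org.apache.avro.io.EncoderFactory;

    public class SchemaIdFraming {
        // Frame a record as: [magic byte 0x0][4-byte schema id][Avro binary payload].
        // The id identifies the schema in an external registry.
        public static byte[] encode(GenericRecord record, Schema schema, int schemaId)
                throws IOException {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(0x0);                                             // wire-format "magic" byte
            out.write(ByteBuffer.allocate(4).putInt(schemaId).array()); // big-endian schema id
            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
            new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
            encoder.flush();
            return out.toByteArray();
        }
    }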

On 11/02/2016 05:48 AM, david.frank...@bt.com wrote:

I am using Kafka Connect in source mode, i.e. using it to send events to Kafka
topics.

With the key.converter and value.converter properties set to
org.apache.kafka.connect.storage.StringConverter, I can attach a consumer to the
topics and see the events in a readable form.  This is helpful and reassuring,
but it is not the desired representation for my downstream consumers, which
require the events to be Avro-encoded.
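I.e. the relevant worker properties currently contain:

    key.converter=org.apache.kafka.connect.storage.StringConverter
    value.converter=org.apache.kafka.connect.storage.StringConverter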

It seems that to write Avro-encoded events to Kafka, these properties need
to be set to io.confluent.kafka.serializers.KafkaAvroSerializer.  Is this
correct?

I am not using the Confluent platform, merely the standard Apache Kafka 0.10
download, and I have been unable to find these classes in a Maven repository.
http://docs.confluent.io/3.0.0/app-development.html#java suggests that they
are available via:

    <dependency>
        <groupId>io.confluent</groupId>
        <artifactId>kafka-avro-serializer</artifactId>
        <version>3.0.0</version>
    </dependency>

But that doesn't appear to be the case.  The class exists in
https://raw.githubusercontent.com/confluentinc/schema-registry/master/avro-converter/src/main/java/io/confluent/connect/avro/AvroConverter.java
but it seems to use the Schema Registry, which is something I'd rather avoid.
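Could it be that the Confluent artifacts are published to Confluent's own Maven
repository rather than Maven Central? If so, I'd presumably need something like
this in the pom:

    <repositories>
        <repository>
            <id>confluent</id>
            <url>http://packages.confluent.io/maven/</url>
        </repository>
    </repositories>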

I'd be grateful for any pointers on the simplest way of getting Avro-encoded
events written to Kafka from a Kafka Connect source connector/task.

Also, in the task that creates SourceRecords, I'm choosing Schema.BYTES_SCHEMA
for the 4th arg in the constructor, but I'm not clear on what this achieves;
some light shed on that would also be helpful.
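I.e. I'm constructing records along these lines (the partition/offset maps,
topic name, and payload here are made-up placeholders):

    import java.nio.charset.StandardCharsets;
    import java.util.Collections;
    import java.util.Map;

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.source.SourceRecord;

    public class ExampleTask {
        public SourceRecord buildRecord() {
            Map<String, ?> sourcePartition = Collections.singletonMap("source", "my-source");
            Map<String, ?> sourceOffset = Collections.singletonMap("position", 42L);
            byte[] payload = "example event".getBytes(StandardCharsets.UTF_8);

            // 4th arg: the Connect schema for the value. BYTES_SCHEMA marks the
            // value as an opaque byte array -- but I'm unclear what the configured
            // converter then does with it downstream.
            return new SourceRecord(sourcePartition, sourceOffset,
                    "my-topic", Schema.BYTES_SCHEMA, payload);
        }
    }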

Many thanks,
David



--
Tommy Becker
Senior Software Engineer
O +1 919.460.4747
tivo.com

