Hi Kidong,

That's pretty cool! I'm curious what this offers over the Confluent HDFS
connector <https://github.com/confluentinc/kafka-connect-hdfs>, though.
The README mentions not depending on the Schema Registry, and that the
schema can instead be retrieved via the classpath or Consul. This
functionality should actually be pluggable with Connect by implementing a
custom `Converter`; for example, the Schema Registry ships with
AvroConverter, which acts as that glue. Converter classes can be specified
with the `key.converter` and `value.converter` configs; a minimal sketch
follows after the quoted message below.

Best,
Shikhar

On Mon, Aug 1, 2016 at 1:56 AM Kidong Lee <mykid...@gmail.com> wrote:

> Hi,
>
> I have written a simple Kafka ETL tool which consumes Avro-encoded data
> from Kafka and saves it to Parquet on HDFS:
> https://github.com/mykidong/kafka-etl-consumer
>
> It is implemented with the Kafka Consumer API and the Parquet Writer API.
>
> - Kidong Lee.
>
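For illustration, here is a rough sketch of what such a converter could
look like. This is not the project's actual code: the Consul KV key layout
("schemas/<topic>"), the "consul.url" config key, and the class/package
names are all assumptions invented for this example, and it leans on the
AvroData helper from Confluent's converter artifact, which does the
Avro <-> Connect translation locally without talking to the registry.

  package com.example;  // hypothetical package

  import java.io.ByteArrayOutputStream;
  import java.io.IOException;
  import java.io.InputStream;
  import java.net.HttpURLConnection;
  import java.net.URL;
  import java.nio.charset.StandardCharsets;
  import java.util.Map;

  import org.apache.avro.generic.GenericDatumReader;
  import org.apache.avro.generic.GenericDatumWriter;
  import org.apache.avro.io.BinaryEncoder;
  import org.apache.avro.io.Decoder;
  import org.apache.avro.io.DecoderFactory;
  import org.apache.avro.io.EncoderFactory;
  import org.apache.kafka.connect.data.Schema;
  import org.apache.kafka.connect.data.SchemaAndValue;
  import org.apache.kafka.connect.errors.DataException;
  import org.apache.kafka.connect.storage.Converter;

  import io.confluent.connect.avro.AvroData;

  // Sketch of a Converter that resolves Avro schemas from Consul's KV
  // store instead of the Schema Registry.
  public class ConsulAvroConverter implements Converter {

    private final AvroData avroData = new AvroData(100); // schema cache size
    private String consulUrl;

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
      // Hypothetical setting; how it reaches the converter depends on the
      // worker configuration.
      this.consulUrl = (String) configs.get("consul.url");
    }

    @Override
    public byte[] fromConnectData(String topic, Schema schema, Object value) {
      try {
        org.apache.avro.Schema avroSchema = fetchSchema(topic);
        // Translate Connect schema/value into an Avro datum, then serialize.
        Object avroValue = avroData.fromConnectData(schema, value);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<Object>(avroSchema).write(avroValue, encoder);
        encoder.flush();
        return out.toByteArray();
      } catch (IOException e) {
        throw new DataException("Avro serialization failed for " + topic, e);
      }
    }

    @Override
    public SchemaAndValue toConnectData(String topic, byte[] value) {
      try {
        org.apache.avro.Schema avroSchema = fetchSchema(topic);
        // Deserialize the Avro bytes, then translate back into Connect's
        // schema/value pair.
        Decoder decoder = DecoderFactory.get().binaryDecoder(value, null);
        Object avroValue =
            new GenericDatumReader<Object>(avroSchema).read(null, decoder);
        return avroData.toConnectData(avroSchema, avroValue);
      } catch (IOException e) {
        throw new DataException("Avro deserialization failed for " + topic, e);
      }
    }

    // Fetch the schema JSON for a topic from Consul's KV store; the ?raw
    // query param makes Consul return the bare value rather than the
    // base64-wrapped JSON envelope. The "schemas/<topic>" key layout is
    // made up for this sketch.
    private org.apache.avro.Schema fetchSchema(String topic) throws IOException {
      URL url = new URL(consulUrl + "/v1/kv/schemas/" + topic + "?raw");
      HttpURLConnection conn = (HttpURLConnection) url.openConnection();
      try (InputStream in = conn.getInputStream()) {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = in.read(chunk)) > 0) buf.write(chunk, 0, n);
        String json = new String(buf.toByteArray(), StandardCharsets.UTF_8);
        return new org.apache.avro.Schema.Parser().parse(json);
      } finally {
        conn.disconnect();
      }
    }
  }

A worker would then wire it in with something like (again, hypothetical
names):

  key.converter=com.example.ConsulAvroConverter
  value.converter=com.example.ConsulAvroConverter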