I know this reply is quite late. I'm not aware of any existing Flume Parquet
writer. If it were me, I would stream the data to HDFS in Avro format and
then use an ETL job (perhaps via Spark or Impala) to convert the Avro files
to Parquet in large batches. Parquet is well suited to large batches of
records because of its columnar layout.
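As a rough sketch of the first step, a Flume agent along these lines would land the Kafka data on HDFS as Avro container files. Agent, component, broker, topic, and path names below (a1, k1, c1, s1, broker1:9092, events, /data/raw/avro) are all placeholders, and sizing values are illustrative, not recommendations:

```properties
# Kafka source -> memory channel -> HDFS sink writing Avro files
a1.sources = k1
a1.channels = c1
a1.sinks = s1

a1.sources.k1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.k1.kafka.bootstrap.servers = broker1:9092
a1.sources.k1.kafka.topics = events
a1.sources.k1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

a1.sinks.s1.type = hdfs
a1.sinks.s1.channel = c1
a1.sinks.s1.hdfs.path = /data/raw/avro/%Y-%m-%d
a1.sinks.s1.hdfs.fileType = DataStream
a1.sinks.s1.hdfs.fileSuffix = .avro
# Roll files hourly or at ~128 MB, whichever comes first,
# so the downstream batch job sees reasonably large files
a1.sinks.s1.hdfs.rollInterval = 3600
a1.sinks.s1.hdfs.rollSize = 134217728
a1.sinks.s1.hdfs.rollCount = 0
# avro_event wraps each event in Flume's generic event schema;
# if the Kafka payload is already Avro you may want a custom
# serializer that preserves your own schema instead
a1.sinks.s1.serializer = avro_event
```

The batch conversion itself can then be a small Spark job, e.g. (with the spark-avro data source on the classpath) something like `spark.read.format("avro").load("/data/raw/avro/...").write.parquet("/data/parquet/...")`, run on whatever schedule matches your file-roll interval.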

Mike

On Sun, Jul 16, 2017 at 11:24 PM, Kumar, Ashok 6. (Nokia - IN/Bangalore) <
ashok.6.ku...@nokia.com> wrote:

> Hi all,
>
> I have Avro data coming from Kafka and I want to convert it to Parquet
> using Flume. I am not sure how to do it. Can anyone help me with this?
>
> Regards,
>
> Ashok
>