Do you mean Camus didn't write any data to HDFS, or that you can't read the data on HDFS? SimpleConsumer works with compressed data; you just need to make sure that the fetch size is larger than the compressed message set. Also, have you asked the Camus mailing list?
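For reference, here is a minimal sketch of fetching a compressed topic with the Kafka 0.8 SimpleConsumer API. The broker host, topic, partition, and client id are placeholders, and the 1 MB fetch size is just an example of setting it large enough to cover a whole compressed message set; the message-set iterator then decompresses the batch transparently.

```java
import kafka.api.FetchRequest;
import kafka.api.FetchRequestBuilder;
import kafka.javaapi.FetchResponse;
import kafka.javaapi.consumer.SimpleConsumer;
import kafka.javaapi.message.ByteBufferMessageSet;
import kafka.message.MessageAndOffset;

public class CompressedFetchExample {
    public static void main(String[] args) {
        // Placeholder broker/topic/partition; 100s socket timeout, 64 KB socket buffer.
        SimpleConsumer consumer =
            new SimpleConsumer("broker-host", 9092, 100000, 64 * 1024, "fetch-test");
        try {
            FetchRequest request = new FetchRequestBuilder()
                .clientId("fetch-test")
                // The fetch size (last argument) must be larger than the whole
                // compressed message set, or the consumer gets no usable messages.
                .addFetch("my-topic", 0, 0L, 1024 * 1024)
                .build();
            FetchResponse response = consumer.fetch(request);
            ByteBufferMessageSet messages = response.messageSet("my-topic", 0);
            // Iterating decompresses Snappy/GZIP message sets transparently.
            for (MessageAndOffset mo : messages) {
                System.out.println("offset " + mo.offset()
                    + ", payload bytes " + mo.message().payloadSize());
            }
        } finally {
            consumer.close();
        }
    }
}
```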
Thanks,

Jun

On Wed, Mar 19, 2014 at 7:24 AM, Zhu Wayne <zhuw.chic...@gmail.com> wrote:

> I am using Camus to get compressed Snappy messages into HDFS. However, I
> can't get records in HDFS even though CamusJob was completed successfully.
> Getting uncompressed AVRO messages into HDFS is fine.
>
> According to Neha's post, the consumer should be *agnostic to compression*.
>
> http://geekmantra.wordpress.com/2013/03/28/compression-in-kafka-gzip-or-snappy/
>
> Kafka console consumer can handle compression w/o any issue. Camus uses
> SimpleConsumer. I am not sure if it can handle compression correctly.
> Again, thanks for the help in advance.
>
> Best,
>
> Wayne Zhu