Hi,

Which of the Flume sources are you trying to use?
Regards,
Harish

On Mon, Oct 22, 2012 at 11:18 AM, Sadananda Hegde <[email protected]> wrote:

> My application servers produce data files in compressed format (gzip). I
> am planning to use Flume NG (1.2.0) to collect those files and transfer
> them to the Hadoop cluster (write to HDFS). Is it possible to read and
> transfer them without uncompressing them first? My sink would be HDFS,
> and there are options to compress before writing to HDFS. That would work
> fine if my source were an uncompressed text file that needed to be stored
> in HDFS in compressed format. But in my case, the source itself is
> compressed. What would be the best option to handle such cases?
>
> Thanks for your help.
>
> Sadu
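For the "compress on write" half of the quoted question, the HDFS sink does expose compression settings. A minimal sketch, assuming a hypothetical agent named `agent` with a sink named `hdfsSink` (the property names are from the Flume NG HDFS sink; the path is a placeholder):

```properties
# Hypothetical agent/sink names; only the hdfs.* keys are Flume properties.
agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.hdfs.path = hdfs://namenode/flume/events
# Write a compressed stream instead of a SequenceFile
agent.sinks.hdfsSink.hdfs.fileType = CompressedStream
# Codec used when compressing on write
agent.sinks.hdfsSink.hdfs.codeC = gzip
```

Note this only compresses events as they are written out; Flume is event-oriented, so it does not simply ship an already-gzipped file through unchanged, which is why the choice of source matters here.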
