Hi,
I'm working with compression in Hadoop. The idea is to compress a file on
Linux, process it in Hadoop, and then decompress the result back on Linux. I'm
able to do this for gzip and bzip2, but when I tried the default compression,
the compression step worked while the decompression did not: the output file
came back in an unreadable format. Could you please clarify what went wrong?

To compress   : compress -c file.txt > file.txt.Z
To decompress : time decompress test.Z  (test.Z is the output file obtained
after processing in Hadoop)
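
For what it's worth, here is a small sketch (assuming hadoop-common is on the
classpath; the class name CodecCheck is just an example) that prints which
codec, if any, Hadoop's CompressionCodecFactory associates with a given file
name. gzip and bzip2 files are matched by their extensions, whereas a .Z file
produced by the Unix compress command may not map to any registered codec, in
which case Hadoop would just read and write the bytes as plain data:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;

public class CodecCheck {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        CompressionCodecFactory factory = new CompressionCodecFactory(conf);

        // File names to probe; ".Z" is what the Unix compress command produces.
        String[] names = {"file.txt.gz", "file.txt.bz2", "file.txt.deflate", "file.txt.Z"};

        for (String name : names) {
            CompressionCodec codec = factory.getCodec(new Path(name));
            System.out.println(name + " -> "
                    + (codec == null ? "no codec registered" : codec.getClass().getName()));
        }
    }
}

If the .Z name resolves to no codec, that would explain why the file survives
the round trip in an unreadable form.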



Thanks!
