Hadoop devs,

A colleague of mine recently hit a strange issue where the zstd compression
codec crashes with the following error:

Caused by: java.lang.InternalError: Error (generic)
        at org.apache.hadoop.io.compress.zstd.ZStandardCompressor.deflateBytesDirect(Native Method)
        at org.apache.hadoop.io.compress.zstd.ZStandardCompressor.compress(ZStandardCompressor.java:216)
        at org.apache.hadoop.io.compress.CompressorStream.compress(CompressorStream.java:81)
        at org.apache.hadoop.io.compress.CompressorStream.write(CompressorStream.java:76)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:57)
        at java.io.DataOutputStream.write(DataOutputStream.java:107)
        at org.apache.tez.runtime.library.common.sort.impl.IFile$Writer.writeKVPair(IFile.java:617)
        at org.apache.tez.runtime.library.common.sort.impl.IFile$Writer.append(IFile.java:480)

Is anyone out there hitting a similar problem?

A temporary workaround is to set the codec buffer size explicitly, e.g. in
the session:

    set io.compression.codec.zstd.buffersize=8192;
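
For code that uses the codec directly, the same knob can be applied through
the Hadoop Configuration API. A minimal sketch, assuming a native Hadoop
build with zstd support; the property and class names are from Hadoop's
ZStandardCodec, while the output path and payload are made up for
illustration:

    import java.io.FileOutputStream;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.io.compress.CompressionOutputStream;
    import org.apache.hadoop.io.compress.ZStandardCodec;
    import org.apache.hadoop.util.ReflectionUtils;

    public class ZstdBufferSizeWorkaround {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same knob as the "set" command above; the default of 0 lets
        // the native library pick its recommended buffer size.
        conf.setInt("io.compression.codec.zstd.buffersize", 8192);

        // newInstance() applies the Configuration to the codec.
        CompressionCodec codec =
            ReflectionUtils.newInstance(ZStandardCodec.class, conf);

        byte[] payload = "hello zstd".getBytes(StandardCharsets.UTF_8);
        try (CompressionOutputStream out =
            codec.createOutputStream(new FileOutputStream("/tmp/test.zst"))) {
          out.write(payload);
          out.finish();
        }
      }
    }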

We suspect it's a bug in the zstd library, but couldn't verify that. Just
sending this out to see if anyone has run into it or can shed some light.
