Hi,

I am running Cloudera Hadoop 0.20.2-cdh3u0.
I have a program that uploads local files to HDFS every hour. It opens a gzip input stream with in = new GZIPInputStream(fin); and copies the decompressed bytes to an HDFS file. After running for less than two days, it hangs at FSDataOutputStream.close(86). Here is the stack of the stuck thread:

State: WAITING
Running 16660 ms (user 13770 ms)
blocked 11276 times for <> ms
waiting 11209 times for <> ms
LockName: java.util.LinkedList@f1ca0de
LockOwnerId: -1
java.lang.Object.wait(-2)
java.lang.Object.wait(485)
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.waitForAckedSeqno(3468)
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.flushInternal(3457)
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.closeInternal(3549)
org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.close(3488)
org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(61)
org.apache.hadoop.fs.FSDataOutputStream.close(86)
org.apache.hadoop.io.IOUtils.copyBytes(59)
org.apache.hadoop.io.IOUtils.copyBytes(74)
com.turn.platform.datahub.mapreduce2.scheduler.DatahubDownloadJob.uploadFiles(171)

Any suggestions on how to avoid this issue?

Thanks,
Mingxi
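P.S. For reference, here is a simplified sketch of what the upload path does. This is a reconstruction, not the exact code: the class name, method name, and paths are placeholders, but the stream handling matches the stack trace above (IOUtils.copyBytes closes the FSDataOutputStream, and that close is where the thread blocks in waitForAckedSeqno):

import java.io.FileInputStream;
import java.io.InputStream;
import java.util.zip.GZIPInputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsUploader {
    // Decompresses a local .gz file and writes its contents to HDFS.
    public static void upload(Configuration conf, String localGzFile, String hdfsPath)
            throws Exception {
        FileSystem fs = FileSystem.get(conf);
        InputStream in = new GZIPInputStream(new FileInputStream(localGzFile));
        FSDataOutputStream out = fs.create(new Path(hdfsPath));
        // copyBytes(in, out, conf) closes both streams when the copy finishes;
        // closing 'out' triggers DFSOutputStream.close -> waitForAckedSeqno,
        // which is where the hang shows up in the stack trace.
        IOUtils.copyBytes(in, out, conf);
    }
}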