Which Hadoop release are you using?
Can you pastebin more of the server logs?

bq. load file larger than 20M

Do you store such file(s) directly on HDFS and put their paths in HBase?
See HBASE-11339 HBase MOB
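To illustrate the path-in-HBase idea, here is a minimal sketch of the pattern. Note the assumptions: a `HashMap` stands in for the HBase table, a plain string stands in for the HDFS path, and the 1 MB inline cutoff is hypothetical; real code would use `FileSystem.create(...)` for the blob and `Table.put(...)` for the cell.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// Sketch of the "file on HDFS, path in HBase" pattern: small values go
// straight into the cell, large blobs are written elsewhere and only their
// location is stored under the row key.
public class PathInTable {
    static final long INLINE_LIMIT = 1L * 1024 * 1024; // hypothetical 1 MB cutoff

    // Stand-in for the HBase table (row key -> cell value).
    static final Map<String, byte[]> table = new HashMap<>();

    static void store(String rowKey, byte[] value, String hdfsDir) {
        if (value.length <= INLINE_LIMIT) {
            table.put(rowKey, value);                      // small: keep inline
        } else {
            String path = hdfsDir + "/" + rowKey;          // large: path only
            // writeToHdfs(path, value);                   // real code: FileSystem.create(path)
            table.put(rowKey, path.getBytes(StandardCharsets.UTF_8));
        }
    }

    public static void main(String[] args) {
        store("pic-001", new byte[200 * 1024], "/data/blobs");       // 200 KB: inline
        store("pic-002", new byte[20 * 1024 * 1024], "/data/blobs"); // 20 MB: path only
        System.out.println(table.get("pic-001").length);
        System.out.println(new String(table.get("pic-002"), StandardCharsets.UTF_8));
    }
}
```

This keeps every HBase cell small, which avoids oversized writes through the DataXceiver; reads then fetch the blob from HDFS using the stored path.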

On Tue, Sep 16, 2014 at 7:29 PM, QiXiangming <qixiangm...@hotmail.com>
wrote:

> hello, everyone
>         I use HBase to store small pics or files, and hit an exception
> raised from HDFS, as follows:
>
>  slave2:50010:DataXceiver error processing WRITE_BLOCK operation  src: /192.168.20.246:33162 dest: /192.168.20.247:50010
> java.io.IOException: Premature EOF from inputStream
> at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:194)
> at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
> at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
> at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
> at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:446)
> at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:702)
> at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:711)
> at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:124)
> at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:71)
> at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:229)
> at java.lang.Thread.run(Thread.java:745)
>
> when HBase stores pics or files under 200k, it works well,
> but if you load a file larger than 20M, HBase definitely goes down!
>
> what's wrong with it?
> can anyone help us?
>
>
> URGENT!!!
>
>
> Qi Xiangming
>
