From first look, the exception came from the HDFS layer. What release of Hadoop are you using?
If you can pastebin some more of the region server log, that would give us more clues.

Cheers

On Wed, Apr 15, 2015 at 6:58 PM, 侯野 <[email protected]> wrote:
> hi all:
>
> I have got an error on my HBase cluster. The version is HBase 0.98.6.1-hadoop2.
> ---------------------------------------------------------
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:475)
> 2015-04-15 05:19:04,221 ERROR [Thread-379358] hdfs.DFSClient: Failed to close file /hbase/WALs/dw12,60020,1420093890966/dw12%2C60020%2C1420093890966.1426662392220
> org.apache.hadoop.ipc.RemoteException(java.lang.ArrayIndexOutOfBoundsException): java.lang.ArrayIndexOutOfBoundsException
>         at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>         at com.sun.proxy.$Proxy13.getAdditionalDatanode(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getAdditionalDatanode(ClientNamenodeProtocolTranslatorPB.java:352)
>         at sun.reflect.GeneratedMethodAccessor90.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>         at com.sun.proxy.$Proxy14.getAdditionalDatanode(Unknown Source)
>         at sun.reflect.GeneratedMethodAccessor90.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.hbase.fs.HFileSystem$1.invoke(HFileSystem.java:294)
>         at com.sun.proxy.$Proxy15.getAdditionalDatanode(Unknown Source)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:919)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:1031)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.processDatanodeError(DFSOutputStream.java:823)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:475)
> 2015-04-15 05:19:04,221 INFO [Thread-9] regionserver.ShutdownHook: Shutdown hook finished.
> ------------------------------------------------------------------------------------------------------------
>
> This error causes my HBase cluster to shut down some RegionServers every day.
>
> I did not find any answer when I checked the mailing list and the issue tracker.
>
> So, how can I solve this problem?
>
> Thanks.
>
> maven hou
> On the road
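For context on the trace above (not part of the original thread): the failing `getAdditionalDatanode` call happens during HDFS write-pipeline recovery, whose client-side behavior is controlled by the `dfs.client.block.write.replace-datanode-on-failure.*` settings in hdfs-site.xml. A minimal sketch of those settings follows; the values shown are illustrative defaults, and tuning them only works around pipeline-recovery trouble on small clusters rather than fixing the ArrayIndexOutOfBoundsException itself.

```xml
<!-- Sketch of the HDFS client settings that govern datanode replacement
     during pipeline recovery (hdfs-site.xml on the HBase RegionServer side).
     Values shown are illustrative, not a recommended fix for this bug. -->
<configuration>
  <property>
    <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
    <!-- Whether the client tries to add a replacement datanode at all. -->
    <value>true</value>
  </property>
  <property>
    <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
    <!-- DEFAULT only replaces when the pipeline shrinks below a threshold;
         NEVER/ALWAYS are the other options. -->
    <value>DEFAULT</value>
  </property>
</configuration>
```

On clusters with only a handful of datanodes, replacement can be impossible to satisfy, so failures in `addDatanode2ExistingPipeline` (as in the trace) are often mitigated by relaxing these settings; the exception reported here, however, is a server-side `RemoteException`, which is why the Hadoop version matters.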
