Hey folks! I tried to decommission a datanode from my Hadoop cluster. These are the steps I followed:
1. Added this property to core-site.xml:

   <property>
     <name>dfs.hosts.exclude</name>
     <value>/home/hadoop/excludes</value>
     <final>true</final>
   </property>

2. Added this property to mapred-site.xml:

   <property>
     <name>mapred.hosts.exclude</name>
     <value>/home/hadoop/excludes</value>
     <final>true</final>
   </property>

3. Created the excludes file and added an ip:port entry to it, e.g. 10.0.3.31:50010

4. Ran the command: hadoop dfsadmin -refreshNodes
   (steps 3-4 are summarized in the sketch at the end of this message)

5. After that, my live-node count dropped to 0 and all datanodes became dead. I checked the namenode logs and found these error messages:

2011-09-19 12:33:47,695 INFO org.apache.hadoop.ipc.Server: IPC Server handler 24 on 9000, call sendHeartbeat(DatanodeRegistration(10.0.3.16:50010, storageID=DS-1703098060-10.0.3.16-50010-1298269611944, infoPort=50075, ipcPort=50020), 2012206694400, 1650194042865, 271003275264, 0, 1) from 10.0.3.16:38587: error: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: 10.0.3.16:50010
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: 10.0.3.16:50010
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.handleHeartbeat(FSNamesystem.java:2235)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.sendHeartbeat(NameNode.java:704)
    at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
2011-09-19 12:33:47,701 INFO org.apache.hadoop.ipc.Server: IPC Server handler 7 on 9000, call sendHeartbeat(DatanodeRegistration(10.0.5.36:50010, storageID=DS-809855347-10.0.5.36-50010-1316252293924, infoPort=50075, ipcPort=50020), 1938687860736, 1390486994944, 457712619520, 0, 1) from 10.0.5.36:58924: error: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: 10.0.5.36:50010
org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode: 10.0.5.36:50010
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.handleHeartbeat(FSNamesystem.java:2235)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.sendHeartbeat(NameNode.java:704)
    at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

Please suggest; any help would be appreciated!

--
With Regards,
Vikas Srivastava
DWH & Analytics Team
Mob: +91 9560885900
One97 | Let's get talking!
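For reference, steps 3-4 above boil down to something like the following. This is only a minimal sketch: the path /home/hadoop/excludes, the example entry 10.0.3.31:50010, and the 0.20-era "hadoop dfsadmin" CLI are all taken from the message itself, and whether the entry should carry the :50010 port suffix is exactly what the post leaves open, not something asserted here.

    # Step 3: create the excludes file that dfs.hosts.exclude /
    # mapred.hosts.exclude point at (one datanode entry per line).
    # The ip:port format below simply mirrors the example in the post.
    cat > /home/hadoop/excludes <<'EOF'
    10.0.3.31:50010
    EOF

    # Step 4: ask the namenode to re-read its include/exclude files.
    hadoop dfsadmin -refreshNodes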