If the block is missing and all datanodes have checked in, then it's gone.
Grep for the block name in the namenode log to get a sense of the block's history.
Can you figure out what happened to it?
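A sketch of what that grep looks like; the block ID, paths, and log lines below are all placeholders, so substitute a block ID reported by fsck and your real NameNode log:

```shell
# The real NameNode log usually lives under $HADOOP_HOME/logs
# (hadoop-<user>-namenode-<host>.log); against it you would run e.g.:
#   grep 'blk_12345' $HADOOP_HOME/logs/hadoop-*-namenode-*.log
# For illustration, grep a tiny fake log so this snippet is self-contained:
NN_LOG=$(mktemp)
printf '%s\n' \
  '2010-12-25 06:30:01 INFO NameSystem.allocateBlock: /user/singo/data. blk_12345' \
  '2010-12-25 06:38:41 WARN NameSystem: blk_12345 is not found on any datanode' \
  > "$NN_LOG"
grep 'blk_12345' "$NN_LOG"
```

The allocate line tells you which file the block belonged to; later warnings tell you when the namenode lost track of its replicas.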
St.Ack
On Sun, Dec 26, 2010 at 2:47 AM, SingoWong wrote:
> but how to solve it? fsck only shows which paths are corrupt, but how can I
> fix or recover the missing blocks?
but how to solve it? fsck only shows which paths are corrupt, but how can I
fix or recover the missing blocks?
Regards,
Singo
On Sat, Dec 25, 2010 at 6:49 AM, Ryan Rawson wrote:
> Missing blocks = missing datanodes (or datanode/data directory
> corruption). You can run bin/hadoop fsck / to see what was affected.
Missing blocks = missing datanodes (or datanode/data directory
corruption). You can run bin/hadoop fsck / to see what was affected.
I generally run into this when not all datanodes have joined the namenode.
'fsck' will show you which paths are "corrupt" (aka missing replicas).
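A sketch of pulling just the affected paths out of the fsck report; the sample lines below are made up to mimic the report format, and in practice you would pipe the real command instead:

```shell
# Against a live cluster you would run something like:
#   bin/hadoop fsck / -files -blocks -locations | grep -E 'MISSING|CORRUPT'
# Here we filter a hypothetical captured report so the snippet runs anywhere:
printf '%s\n' \
  '/user/singo/part-00000: MISSING 2 blocks of total size 134217728 B.' \
  '/user/singo/part-00001: OK' \
  '/user/singo/part-00002: CORRUPT block blk_987' \
  | grep -E 'MISSING|CORRUPT'
```

That gives you the list of files to restore from a backup or regenerate, since HDFS itself cannot reconstruct a block with no surviving replica.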
Good luck!
-ryan
Hi,
My DFS home page displays the message below:
WARNING : There are about 7179 missing blocks. Please check the log or run
fsck.
Who can tell me what this message means and how to fix it? Because so far I
cannot get at and analyse the files in Hadoop.
And I checked the log file:
2010-12-25 06:38:41,787 INF