Re: which part of Hadoop is responsible for distributing the input file fragments to datanodes?

2012-11-15 Thread Yanbo Liang
I guess you mean to set your own block distribution strategy. If so, just hack the code along this path: FSNamesystem.getAdditionalBlock() ---> BlockManager.chooseTarget() ---> BlockPlacementPolicy.chooseTarget(). You will need to implement your own BlockPlacementPolicy. Then if the client
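A minimal sketch of how such a custom policy could be wired in, assuming the 2.x-era configuration key from DFSConfigKeys; the policy class name is hypothetical, and the exact chooseTarget() signature your subclass overrides differs between releases, so verify both against your Hadoop version:

import org.apache.hadoop.conf.Configuration;

public class CustomPlacementWiring {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // The NameNode instantiates whatever BlockPlacementPolicy subclass
        // is named here; check the key in your release's DFSConfigKeys.
        conf.set("dfs.block.replicator.classname",
                 "com.example.MyBlockPlacementPolicy");   // hypothetical class
        System.out.println("block placement policy = "
                + conf.get("dfs.block.replicator.classname"));
    }
}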

Re: Unable to build Hadoop from source

2012-09-29 Thread Yanbo Liang
It means that there is one failing test case in your build. You can go to the hadoop-common-project/hadoop-common/target/surefire-reports/ directory and check the output and logs to find the reason for the failure. As far as I know, a testDelegationTokenSecretManager failure may be caused by missing some

Should use Filesystem.setPermission() rather than File.setWritable() to change a file access permission.

2012-09-24 Thread Yanbo Liang
Hi all, In the current Hadoop test cases, if we want to corrupt or disable a directory or file, we use File.setWritable(false), for example in TestStorageRestore.java and TestCheckpoint.java. But as we all know, the implementation of setWritable() in the Java API is system-dependent, and there a
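A minimal sketch of the suggested replacement, contrasting the two calls; the local path and the permission mask below are illustrative, not taken from the actual test code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;
import java.io.File;
import java.io.IOException;

public class DisableDirectoryExample {
    public static void main(String[] args) throws IOException {
        // System-dependent: may silently do nothing on some platforms
        // (e.g. when the tests run as root).
        File localDir = new File("/tmp/storage-dir");   // illustrative path
        System.out.println("setWritable(false) returned "
                + localDir.setWritable(false));

        // Portable alternative via the Hadoop FileSystem API:
        // set an explicit permission mask (r-xr-xr-x) on the same directory.
        FileSystem fs = FileSystem.getLocal(new Configuration());
        fs.setPermission(new Path("/tmp/storage-dir"),
                new FsPermission((short) 0555));
    }
}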

10 failures when run test on hadoop-common-project

2012-07-23 Thread Yanbo Liang
Hi All, I just ran the tests of hadoop-common-project at revision 1364560. It produced 10 FAILURES after I typed "mvn test" under the "hadoop-common-project" folder. But the latest build on the Jenkins server is Hadoop-Common-trunk #480, and no test failures or errors occurred there.