Hi all,

My name is Vijay Bhat, and I am looking to contribute to the Hadoop YARN project. I have been using and benefiting from the Hadoop ecosystem for a few years now, and I want to give back to the community that makes it happen.
I forked apache/hadoop on GitHub and synced to the last commit (https://github.com/apache/hadoop/commit/1556f86a31a54733d6550363aa0e027acca7823b) that built successfully on the Apache build server (https://builds.apache.org/view/All/job/Hadoop-Yarn-trunk/758/). However, I get test case failures when I build the Hadoop source code on a VM running Ubuntu 12.04 LTS.

The Maven command I am running from the hadoop base directory is:

    mvn clean install -U

Console output:

    Tests run: 9, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 2.392 sec <<< FAILURE! - in org.apache.hadoop.ipc.TestDecayRpcScheduler
    testAccumulate(org.apache.hadoop.ipc.TestDecayRpcScheduler)  Time elapsed: 0.084 sec  <<< FAILURE!
    java.lang.AssertionError: expected:<3> but was:<2>
        at org.junit.Assert.fail(Assert.java:88)
        at org.junit.Assert.failNotEquals(Assert.java:743)
        at org.junit.Assert.assertEquals(Assert.java:118)
        at org.junit.Assert.assertEquals(Assert.java:555)
        at org.junit.Assert.assertEquals(Assert.java:542)
        at org.apache.hadoop.ipc.TestDecayRpcScheduler.testAccumulate(TestDecayRpcScheduler.java:136)

    testPriority(org.apache.hadoop.ipc.TestDecayRpcScheduler)  Time elapsed: 0.052 sec  <<< FAILURE!
    java.lang.AssertionError: expected:<1> but was:<0>
        at org.junit.Assert.fail(Assert.java:88)
        at org.junit.Assert.failNotEquals(Assert.java:743)
        at org.junit.Assert.assertEquals(Assert.java:118)
        at org.junit.Assert.assertEquals(Assert.java:555)
        at org.junit.Assert.assertEquals(Assert.java:542)
        at org.apache.hadoop.ipc.TestDecayRpcScheduler.testPriority(TestDecayRpcScheduler.java:197)

    Tests run: 3, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 111.519 sec <<< FAILURE! - in org.apache.hadoop.ha.TestZKFailoverControllerStress
    testExpireBackAndForth(org.apache.hadoop.ha.TestZKFailoverControllerStress)  Time elapsed: 45.46 sec  <<< ERROR!
    java.lang.Exception: test timed out after 40000 milliseconds
        at java.lang.Thread.sleep(Native Method)
        at org.apache.hadoop.ha.MiniZKFCCluster.waitForHAState(MiniZKFCCluster.java:164)
        at org.apache.hadoop.ha.MiniZKFCCluster.expireAndVerifyFailover(MiniZKFCCluster.java:236)
        at org.apache.hadoop.ha.TestZKFailoverControllerStress.testExpireBackAndForth(TestZKFailoverControllerStress.java:79)

    Tests run: 19, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 62.514 sec <<< FAILURE! - in org.apache.hadoop.ha.TestZKFailoverController
    testGracefulFailoverFailBecomingStandby(org.apache.hadoop.ha.TestZKFailoverController)  Time elapsed: 15.062 sec  <<< ERROR!
    java.lang.Exception: test timed out after 15000 milliseconds
        at java.lang.Object.wait(Native Method)
        at org.apache.hadoop.ha.ZKFailoverController.waitForActiveAttempt(ZKFailoverController.java:467)
        at org.apache.hadoop.ha.ZKFailoverController.doGracefulFailover(ZKFailoverController.java:657)
        at org.apache.hadoop.ha.ZKFailoverController.access$400(ZKFailoverController.java:61)
        at org.apache.hadoop.ha.ZKFailoverController$3.run(ZKFailoverController.java:602)
        at org.apache.hadoop.ha.ZKFailoverController$3.run(ZKFailoverController.java:599)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1683)
        at org.apache.hadoop.ha.ZKFailoverController.gracefulFailoverToYou(ZKFailoverController.java:599)
        at org.apache.hadoop.ha.ZKFCRpcServer.gracefulFailover(ZKFCRpcServer.java:94)
        at org.apache.hadoop.ha.TestZKFailoverController.testGracefulFailoverFailBecomingStandby(TestZKFailoverController.java:532)

When I skip the tests, the source code compiles successfully:

    mvn clean install -U -DskipTests

Is there something I'm doing incorrectly that's causing the test cases to fail? I'd really appreciate any insight from folks who have gone through this process before.
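One thing I plan to try next, in case the timeouts are just my VM being slow under a full build, is re-running only the failing suites in isolation. This is just a sketch; the module path is my assumption based on the package names (org.apache.hadoop.ipc and org.apache.hadoop.ha) in the traces:

```shell
# Change into the module that should contain the failing test classes.
# (Path assumed from the package names; I have not verified it yet.)
cd hadoop-common-project/hadoop-common

# Surefire's -Dtest flag accepts a single class name or a
# comma-separated list, so each suite can be run on its own.
mvn test -Dtest=TestDecayRpcScheduler
mvn test -Dtest=TestZKFailoverController,TestZKFailoverControllerStress
```

If the ZK failover tests pass when run alone, the 40000 ms / 15000 ms timeouts in the full run may just be load-related flakiness on the VM rather than a real regression.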
I've also looked at the JIRAs labeled "newbie" (http://wiki.apache.org/hadoop/HowToContribute) but didn't find promising leads.

Thanks for the help!

-Vijay