I have taken a look at the tests, and I should note that the way they are written is kinda misleading. For instance, the message we are seeing in Hudson says expected:<403> but was:<200>, whereas in reality the expected value was <200> and the actual value was <403>. Basically, the order of the assert arguments is reversed in a number of places. While this isn't the cause of the failure, it does confuse the analysis. It'd be great to see a maintainer of this component take a look at the failures so we can eventually have a green HDFS build. I have opened https://issues.apache.org/jira/browse/HDFS-1666 to track it.

Cos

On Thu, Feb 24, 2011 at 10:14AM, Todd Lipcon wrote:
> Can someone familiar with hdfsproxy look into this consistent unit test
> failure? People voted in support of keeping this contrib, but it would be
> easier to be satisfied with that decision if someone stepped up to fix these
> tests that have been failing for quite some time.
>
> -Todd
>
> ---------- Forwarded message ----------
> From: Apache Hudson Server <hud...@hudson.apache.org>
> Date: Thu, Feb 24, 2011 at 4:36 AM
> Subject: Hadoop-Hdfs-trunk - Build # 591 - Still Failing
> To: hdfs-dev@hadoop.apache.org
>
>
> See https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/591/
>
> ###################################################################################
> ########################## LAST 60 LINES OF THE CONSOLE ###########################
> [...truncated 719693 lines...]
>     [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target
>      [echo] Including clover.jar in the war file ...
> [cactifywar] Analyzing war: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/hdfsproxy-2.0-test.war
> [cactifywar] Building war: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/test.war
>
> cactifywar:
>
> test-cactus:
>      [echo] Free Ports: startup-57271 / http-57272 / https-57273
>      [echo] Please take a deep breath while Cargo gets the Tomcat for running the servlet tests...
>     [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/tomcat-config
>     [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
>     [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps
>     [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/tomcat-config/temp
>     [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/logs
>     [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/reports
>      [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
>      [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
>      [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/tomcat-config/conf
>    [cactus] -----------------------------------------------------------------
>    [cactus] Running tests against Tomcat 5.x @ http://localhost:57272
>    [cactus] -----------------------------------------------------------------
>    [cactus] Deploying [/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/test.war] to [/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/contrib/hdfsproxy/target/tomcat-config/webapps]...
>    [cactus] Tomcat 5.x starting...
> Server [Apache-Coyote/1.1] started
>    [cactus] WARNING: multiple versions of ant detected in path for junit
>    [cactus]  jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
>    [cactus]      and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
>    [cactus] Running org.apache.hadoop.hdfsproxy.TestAuthorizationFilter
>    [cactus] Tests run: 4, Failures: 2, Errors: 0, Time elapsed: 0.454 sec
>    [cactus] Test org.apache.hadoop.hdfsproxy.TestAuthorizationFilter FAILED
>    [cactus] Running org.apache.hadoop.hdfsproxy.TestLdapIpDirFilter
>    [cactus] Tomcat 5.x started on port [57272]
>    [cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.32 sec
>    [cactus] Running org.apache.hadoop.hdfsproxy.TestProxyFilter
>    [cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.347 sec
>    [cactus] Running org.apache.hadoop.hdfsproxy.TestProxyForwardServlet
>    [cactus] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.307 sec
>    [cactus] Running org.apache.hadoop.hdfsproxy.TestProxyUtil
>    [cactus] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.024 sec
>    [cactus] Tomcat 5.x is stopping...
>    [cactus] Tomcat 5.x is stopped
>
> BUILD FAILED
> /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:750: The following error occurred while executing this line:
> /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:731: The following error occurred while executing this line:
> /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/contrib/build.xml:48: The following error occurred while executing this line:
> /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/contrib/hdfsproxy/build.xml:343: Tests failed!
>
> Total time: 59 minutes 43 seconds
> [FINDBUGS] Skipping publisher since build result is FAILURE
> Publishing Javadoc
> Archiving artifacts
> Recording test results
> Recording fingerprints
> Publishing Clover coverage report...
> No Clover report will be published due to a Build Failure
> Email was triggered for: Failure
> Sending email for trigger: Failure
>
>
> ###################################################################################
> ############################## FAILED TESTS (if any) ##############################
> 2 tests failed.
> FAILED:  org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.testPathPermit
>
> Error Message:
> expected:<403> but was:<200>
>
> Stack Trace:
> junit.framework.AssertionFailedError: expected:<403> but was:<200>
>         at org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.endPathPermit(TestAuthorizationFilter.java:113)
>         at org.apache.cactus.internal.client.ClientTestCaseCaller.callGenericEndMethod(ClientTestCaseCaller.java:442)
>         at org.apache.cactus.internal.client.ClientTestCaseCaller.callEndMethod(ClientTestCaseCaller.java:209)
>         at org.apache.cactus.internal.client.ClientTestCaseCaller.runTest(ClientTestCaseCaller.java:149)
>         at org.apache.cactus.internal.AbstractCactusTestCase.runBareClient(AbstractCactusTestCase.java:218)
>         at org.apache.cactus.internal.AbstractCactusTestCase.runBare(AbstractCactusTestCase.java:134)
>
>
> FAILED:  org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.testPathPermitQualified
>
> Error Message:
> expected:<403> but was:<200>
>
> Stack Trace:
> junit.framework.AssertionFailedError: expected:<403> but was:<200>
>         at org.apache.hadoop.hdfsproxy.TestAuthorizationFilter.endPathPermitQualified(TestAuthorizationFilter.java:136)
>         at org.apache.cactus.internal.client.ClientTestCaseCaller.callGenericEndMethod(ClientTestCaseCaller.java:442)
>         at org.apache.cactus.internal.client.ClientTestCaseCaller.callEndMethod(ClientTestCaseCaller.java:209)
>         at org.apache.cactus.internal.client.ClientTestCaseCaller.runTest(ClientTestCaseCaller.java:149)
>         at org.apache.cactus.internal.AbstractCactusTestCase.runBareClient(AbstractCactusTestCase.java:218)
>         at org.apache.cactus.internal.AbstractCactusTestCase.runBare(AbstractCactusTestCase.java:134)
>
>
> --
> Todd Lipcon
> Software Engineer, Cloudera
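P.S. For anyone unfamiliar with the pitfall above: JUnit's assertEquals treats its first argument as the expected value and its second as the actual one, so swapping them produces a failure message that reads backwards. A minimal sketch (not the actual TestAuthorizationFilter code; the demo class and variable names are made up, and the helper below just mimics junit.framework.Assert's message format):

```java
// Sketch of the reversed-argument pitfall; not real hdfsproxy test code.
public class AssertOrderDemo {

    // Simplified stand-in for junit.framework.Assert.assertEquals(int, int):
    // the FIRST argument is reported as "expected", the SECOND as "was".
    static void assertEquals(int expected, int actual) {
        if (expected != actual) {
            throw new AssertionError(
                "expected:<" + expected + "> but was:<" + actual + ">");
        }
    }

    public static void main(String[] args) {
        int expectedStatus = 200;  // what the test author wanted
        int actualStatus = 403;    // what the filter actually returned

        // Reversed call: yields the misleading "expected:<403> but was:<200>"
        try {
            assertEquals(actualStatus, expectedStatus);
        } catch (AssertionError e) {
            System.out.println("reversed: " + e.getMessage());
        }

        // Correct call: the message matches reality
        try {
            assertEquals(expectedStatus, actualStatus);
        } catch (AssertionError e) {
            System.out.println("correct:  " + e.getMessage());
        }
        // prints:
        // reversed: expected:<403> but was:<200>
        // correct:  expected:<200> but was:<403>
    }
}
```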