Hadoop-Hdfs-22-branch - Build # 102 - Still Failing

2011-10-23 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-22-branch/102/

###
## LAST 60 LINES OF THE CONSOLE 
###
[...truncated 22049 lines...]
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@3d1ccd" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@1041911" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@2c86d2" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@1df53c9" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@17a872b" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@b497e4" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@847976" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataXceiverServer
[junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
[junit] Test org.apache.hadoop.hdfs.server.datanode.TestFiDataXceiverServer FAILED (crashed)
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiPipelineClose
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 34.793 sec

checkfailure:
[touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:762: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:514: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/src/test/aop/build/aop.xml:230: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:699: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:673: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:741: Tests failed!

Total time: 122 minutes 10 seconds
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
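
The "Pretend there's no more memory" errors above come from Hadoop's fault-injection (FI) test build: an AspectJ around-advice woven into DataXceiver.run() (note the run_aroundBody1$advice frames and the src/test/aop/build/aop.xml reference in the Ant trace) deliberately throws a fake OutOfMemoryError. A minimal annotation-style sketch of that kind of advice, with hypothetical names and not the actual aspect shipped under src/test/aop, might look like:

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

// Hypothetical fault-injection aspect, for illustration only.
@Aspect
public class DataXceiverFaultAspect {

  // Replace the body of DataXceiver.run() with a simulated memory failure.
  // The advice never calls pjp.proceed(), so the real transfer code is skipped.
  @Around("execution(void org.apache.hadoop.hdfs.server.datanode.DataXceiver.run())")
  public void injectOom(ProceedingJoinPoint pjp) throws Throwable {
    throw new OutOfMemoryError("Pretend there's no more memory");
  }
}

TestFiDataXceiverServer then drives the DataNode with a fault like this enabled, which is why every DataXceiver thread in the log dies with the same injected message.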

Jenkins build is still unstable: Hadoop-Hdfs-0.23-Build #48

2011-10-23 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/48/




Hadoop-Hdfs-0.23-Build - Build # 48 - Still Unstable

2011-10-23 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/48/

###
## LAST 60 LINES OF THE CONSOLE 
###
[...truncated 9788 lines...]
[WARNING] Assembly file: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-0.23.0-SNAPSHOT is not a regular file (it may be a directory). It cannot be attached to the project build for installation or deployment.
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs ---
[INFO] 
[INFO] There are 9014 checkstyle errors.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs ---
[INFO] ** FindBugsMojo execute ***
[INFO] canGenerate is true
[INFO] ** FindBugsMojo executeFindbugs ***
[INFO] Temp File is /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-0.23-Build/trunk/hadoop-hdfs-project/hadoop-hdfs/target/findbugsTemp.xml
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] 
[INFO] Building Apache Hadoop HDFS Project 0.23.0-SNAPSHOT
[INFO] 
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ** FindBugsMojo execute ***
[INFO] canGenerate is false
[INFO] 
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS  SUCCESS [3:30.604s]
[INFO] Apache Hadoop HDFS Project  SUCCESS [0.056s]
[INFO] 
[INFO] BUILD SUCCESS
[INFO] 
[INFO] Total time: 3:31.099s
[INFO] Finished at: Sun Oct 23 11:36:54 UTC 2011
[INFO] Final Memory: 59M/756M
[INFO] 
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Publishing Clover coverage report...
Publishing Clover HTML report...
Publishing Clover XML report...
Publishing Clover coverage results...
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Unstable
Sending email for trigger: Unstable



###
## FAILED TESTS (if any) 
##
1 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestDfsOverAvroRpc.testWorkingDirectory

Error Message:
Two methods with same name: delete

Stack Trace:
org.apache.avro.AvroTypeException: Two methods with same name: delete
at org.apache.avro.reflect.ReflectData.getProtocol(ReflectData.java:394)
at org.apache.avro.ipc.reflect.ReflectResponder.&lt;init&gt;(ReflectResponder.java:36)
at org.apache.hadoop.ipc.AvroRpcEngine.createResponder(AvroRpcEngine.java:189)
at org.apache.hadoop.ipc.AvroRpcEngine$TunnelResponder.&lt;init&gt;(AvroRpcEngine.java:196)
at org.apache.hadoop.ipc.AvroRpcEngine.getServer(AvroRpcEngine.java:232)
at org.apache.hadoop.ipc.RPC.getServer(RPC.java:550)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.&lt;init&gt;(NameNodeRpcServer.java:144)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createRpcServer(NameNode.java:350)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:328)
at org.apache.hadoop.hdfs.server.namenode.NameNode.&lt;init&gt;(NameNode.java:452)
at org.apache.hadoop.hdfs.server.namenode.NameNode.&lt;init&gt;(NameNode.java:444)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:742)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:641)
at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:545)
at org.apache.hadoop.hdfs.MiniDFSCluster.&lt;init&gt;(MiniDFSCluster.java:261)
at org.apache.hadoop.hdfs.MiniDFSCluster.&lt;init&gt;(MiniDFSCluster.java:85)
at 
or
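
For context on the failure above: Avro's reflect-based protocol generation keys each RPC message by Java method name, so an interface with overloaded methods (the name reported here is delete, presumably from the NameNode's client protocol) cannot be turned into an Avro protocol. A minimal, self-contained sketch that reproduces the same AvroTypeException, using an illustrative interface rather than the real Hadoop ClientProtocol:

import org.apache.avro.AvroTypeException;
import org.apache.avro.Protocol;
import org.apache.avro.reflect.ReflectData;

// Sketch assuming an Avro 1.x jar on the classpath; the interface below is
// illustrative, not taken from the Hadoop source.
public class OverloadedProtocolDemo {

  interface ClientProtocol {
    boolean delete(String src);                    // first method named "delete"
    boolean delete(String src, boolean recursive); // overload with the same name
  }

  public static void main(String[] args) {
    try {
      // Avro maps each Java method to a protocol message keyed by method name,
      // so the two "delete" overloads collide here.
      Protocol p = ReflectData.get().getProtocol(ClientProtocol.class);
      System.out.println(p);
    } catch (AvroTypeException e) {
      // Expected: "Two methods with same name: delete"
      System.out.println("Avro rejected the interface: " + e.getMessage());
    }
  }
}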

Build failed in Jenkins: Hadoop-Hdfs-trunk #840

2011-10-23 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/840/

--
[...truncated 9940 lines...]
  [javadoc] [loading ...]  (26 similar lines; the loaded class paths were not preserved)


Hadoop-Hdfs-trunk - Build # 840 - Still Failing

2011-10-23 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/840/

###
## LAST 60 LINES OF THE CONSOLE 
###
[...truncated 10133 lines...]
[ERROR] 
[ERROR] [2] [INFO] File: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml does not exist.
[ERROR] 
[ERROR] [3] [INFO] Invalid artifact specification: 'hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml'. Must contain at least three fields, separated by ':'.
[ERROR] 
[ERROR] [4] [INFO] Failed to resolve classpath resource: /assemblies/hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml from classloader: ClassRealm[plugin>org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-3, parent: sun.misc.Launcher$AppClassLoader@126b249]
[ERROR] 
[ERROR] [5] [INFO] Failed to resolve classpath resource: hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml from classloader: ClassRealm[plugin>org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-3, parent: sun.misc.Launcher$AppClassLoader@126b249]
[ERROR] 
[ERROR] [6] [INFO] File: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml does not exist.
[ERROR] 
[ERROR] [7] [INFO] Building URL from location: hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml
[ERROR] Error:
[ERROR] java.net.MalformedURLException: no protocol: hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml
[ERROR] at java.net.URL.&lt;init&gt;(URL.java:567)
[ERROR] at java.net.URL.&lt;init&gt;(URL.java:464)
[ERROR] at java.net.URL.&lt;init&gt;(URL.java:413)
[ERROR] at org.apache.maven.shared.io.location.URLLocatorStrategy.resolve(URLLocatorStrategy.java:54)
[ERROR] at org.apache.maven.shared.io.location.Locator.resolve(Locator.java:81)
[ERROR] at org.apache.maven.plugin.assembly.io.DefaultAssemblyReader.addAssemblyFromDescriptor(DefaultAssemblyReader.java:309)
[ERROR] at org.apache.maven.plugin.assembly.io.DefaultAssemblyReader.readAssemblies(DefaultAssemblyReader.java:140)
[ERROR] at org.apache.maven.plugin.assembly.mojos.AbstractAssemblyMojo.execute(AbstractAssemblyMojo.java:328)
[ERROR] at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
[ERROR] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
[ERROR] at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
[ERROR] at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
[ERROR] at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
[ERROR] at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[ERROR] at java.lang.reflect.Method.invoke(Method.java:597)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Publishing Javadoc
Recording fingerprints
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



#

[jira] [Created] (HDFS-2494) [webhdfs] When Getting the file using OP=OPEN with DN http address, ESTABLISHED sockets are growing.

2011-10-23 Thread Uma Maheswara Rao G (Created) (JIRA)
[webhdfs] When getting the file using OP=OPEN with the DN http address, ESTABLISHED sockets are growing.


 Key: HDFS-2494
 URL: https://issues.apache.org/jira/browse/HDFS-2494
 Project: Hadoop HDFS
  Issue Type: Bug
  Components: data-node
Affects Versions: 0.24.0
Reporter: Uma Maheswara Rao G
Assignee: Uma Maheswara Rao G


As part of a reliability test,
Scenario:
Initially check the socket count --- there are around 42 sockets.
Open the file via the DataNode HTTP address using the op=OPEN request parameter, about 500 times in a loop.
Wait for some time and check the socket count again --- thousands of ESTABLISHED sockets have accumulated (~2052).

Here is the netstat result:

C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042
C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042
C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042
C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042
C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042
C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042
C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042
C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042
C:\Users\uma>netstat | grep 127.0.0.1 | grep ESTABLISHED |wc -l
2042

This count is not coming down.
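
A rough sketch of the kind of client loop described in the scenario above (the host, port, and file path are placeholders, and plain HttpURLConnection stands in for whatever client the reporter actually used):

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical reproduction loop: repeatedly read a file through webhdfs
// OP=OPEN directly against a DataNode HTTP port, then count ESTABLISHED
// sockets with the netstat pipeline shown above.
public class WebHdfsOpenLoop {
  public static void main(String[] args) throws Exception {
    URL url = new URL("http://127.0.0.1:50075/webhdfs/v1/tmp/testfile?op=OPEN");
    for (int i = 0; i < 500; i++) {
      HttpURLConnection conn = (HttpURLConnection) url.openConnection();
      InputStream in = conn.getInputStream();
      try {
        byte[] buf = new byte[4096];
        while (in.read(buf) != -1) {
          // drain the response so the transfer completes
        }
      } finally {
        in.close();        // close the client side of the connection
        conn.disconnect();
      }
    }
  }
}

Running a loop like this and then re-running the netstat command above is how the leaked ESTABLISHED connections on the DataNode side were observed.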




--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira




Hadoop-Hdfs-22-branch - Build # 103 - Still Failing

2011-10-23 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Hdfs-22-branch/103/

###
## LAST 60 LINES OF THE CONSOLE 
###
[...truncated 18867 lines...]
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@10de6a6" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@4cec7" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@5d7490" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@13f2745" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@e7a256" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@748771" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] at java.lang.Thread.run(Thread.java:662)
[junit] Exception in thread "org.apache.hadoop.hdfs.server.datanode.DataXceiver@1ddbdcd" java.lang.OutOfMemoryError: Pretend there's no more memory
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run_aroundBody1$advice(DataXceiver.java:36)
[junit] at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:1)
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataXceiverServer
[junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
[junit] Test org.apache.hadoop.hdfs.server.datanode.TestFiDataXceiverServer FAILED (crashed)
[junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiPipelineClose
[junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 35 sec

checkfailure:
[touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:762: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:514: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/src/test/aop/build/aop.xml:230: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:699: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:673: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:741: Tests failed!

Total time: 165 minutes 48 seconds
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts