See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1526/changes>
Changes:

[jing9] HDFS-5212. Refactor RpcMessage and NFS3Response to support different types of authentication information. Contributed by Jing Zhao.

[brandonli] HADOOP-9669 Reduce the number of byte array creations and copies in XDR data manipulation. Contributed by Haohui Mai

------------------------------------------
[...truncated 10014 lines...]
[INFO] Apache Hadoop Client .............................. SUCCESS [0.674s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.179s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:39.660s
[INFO] Finished at: Wed Sep 18 11:34:01 UTC 2013
[INFO] Final Memory: 77M/748M
[INFO] ------------------------------------------------------------------------
+ cd hadoop-hdfs-project
+ /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs -Pdist -Pnative -Dtar -Pdocs -fae
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Apache Hadoop HDFS
[INFO] Apache Hadoop HttpFS
[INFO] Apache Hadoop HDFS BookKeeper Journal
[INFO] Apache Hadoop HDFS-NFS
[INFO] Apache Hadoop HDFS Project
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target>
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/test/data>
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-jsp-generated-sources-directory) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java>
[INFO] Executed tasks
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (hdfs) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[INFO] Compiling 8 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java>
log4j:WARN No appenders could be found for logger (org.apache.jasper.JspC).
log4j:WARN Please initialize the log4j system properly.
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.312
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (secondary) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 1 JSP source file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.017
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (journal) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 1 JSP source file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.017
[INFO]
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (datanode) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 4 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.063
[INFO]
[INFO] --- build-helper-maven-plugin:1.5:add-source (add-jsp-generated-sources-directory) @ hadoop-hdfs ---
[INFO] Source directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-sources/java> added.
[INFO]
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @ hadoop-hdfs ---
[INFO]
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc-datanode) @ hadoop-hdfs ---
[INFO]
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc-namenode) @ hadoop-hdfs ---
[INFO]
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc-qjournal) @ hadoop-hdfs ---
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-hdfs ---
[INFO] Compiling 521 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[32,48] com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[33,48] com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[55,4] com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[55,33] com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[59,4] com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:[59,35] com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-web-xmls) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/hdfs/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/secondary/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/datanode/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/journal/WEB-INF>
     [copy] Copying 9 files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps>
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (make) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>
     [exec] -- The C compiler identification is GNU
     [exec] -- The CXX compiler identification is GNU
     [exec] -- Check for working C compiler: /usr/bin/gcc
     [exec] CMake Error at /usr/share/cmake-2.8/Modules/CMakeTestCCompiler.cmake:50 (MESSAGE):
     [exec]   The C compiler "/usr/bin/gcc" is not able to compile a simple test program.
     [exec]
     [exec]   It fails with the following output:
     [exec]
     [exec]    Change Dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native/CMakeFiles/CMakeTmp>
     [exec]
     [exec]   Run Build Command:/usr/bin/make "cmTryCompileExec/fast"
     [exec]
     [exec]   /usr/bin/make -f CMakeFiles/cmTryCompileExec.dir/build.make CMakeFiles/cmTryCompileExec.dir/build
     [exec]   make[1]: Entering directory `<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native/CMakeFiles/CMakeTmp>'
     [exec]   /usr/bin/cmake -E cmake_progress_report <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native/CMakeFiles/CMakeTmp/CMakeFiles> 1
     [exec]   Building C object CMakeFiles/cmTryCompileExec.dir/testCCompiler.c.o
     [exec]   /usr/bin/gcc -o CMakeFiles/cmTryCompileExec.dir/testCCompiler.c.o -c <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native/CMakeFiles/CMakeTmp/testCCompiler.c>
     [exec]
     [exec]   Linking C executable cmTryCompileExec
     [exec]   /usr/bin/cmake -E cmake_link_script CMakeFiles/cmTryCompileExec.dir/link.txt --verbose=1
     [exec]   /usr/bin/gcc CMakeFiles/cmTryCompileExec.dir/testCCompiler.c.o -o cmTryCompileExec -rdynamic
     [exec]   gcc: vfork: Resource temporarily unavailable
     [exec] -- Check for working C compiler: /usr/bin/gcc -- broken
     [exec] -- Configuring incomplete, errors occurred!
     [exec]   make[1]: *** [cmTryCompileExec] Error 1
     [exec]   make[1]: Leaving directory `<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native/CMakeFiles/CMakeTmp>'
     [exec]   make: *** [cmTryCompileExec/fast] Error 2
     [exec]
     [exec]   CMake will not be able to correctly generate this project.
     [exec] Call Stack (most recent call first):
     [exec]
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO]
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO]
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS ................................ FAILURE [41.797s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.714s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 44.363s
[INFO] Finished at: Wed Sep 18 11:34:47 UTC 2013
[INFO] Final Memory: 51M/544M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1 -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HDFS-5212
Updating HADOOP-9669