See <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/31/changes>

Changes:

[vinodkv] MAPREDUCE-2913. svn merge -c r1179319 --ignore-ancestry ../../trunk/

[vinodkv] MAPREDUCE-2738. svn merge -c r1179229 --ignore-ancestry ../../trunk/

[vinodkv] MAPREDUCE-2702. svn merge -c r1179188 --ignore-ancestry ../../trunk/

[vinodkv] MAPREDUCE-2907. svn merge -c r1179178 --ignore-ancestry ../../trunk/

[vinodkv] MAPREDUCE-3013. svn merge -c r1179174 --ignore-ancestry ../../trunk/

------------------------------------------
[...truncated 6967 lines...]
[INFO] Compiling 4 source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.7.2:test (default-test) @ hadoop-yarn-server-common ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-yarn-server-common ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/target/hadoop-yarn-server-common-0.23.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (santize-pom) @ hadoop-yarn-server-common ---
[INFO] Executing tasks

main:
     [echo] project.build.directory: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/target>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/target>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hadoop-yarn-server-common ---
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/target/hadoop-yarn-server-common-0.23.0-SNAPSHOT.jar> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/0.23.0-SNAPSHOT/hadoop-yarn-server-common-0.23.0-SNAPSHOT.jar
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/pom.xml> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/0.23.0-SNAPSHOT/hadoop-yarn-server-common-0.23.0-SNAPSHOT.pom
[INFO] 
[INFO] --- maven-install-plugin:2.3.1:install-file (install-sanitized-pom) @ hadoop-yarn-server-common ---
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/target/hadoop-yarn-server-common-0.23.0-SNAPSHOT.jar> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/0.23.0-SNAPSHOT/hadoop-yarn-server-common-0.23.0-SNAPSHOT.jar
[INFO] Installing <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-common/target/saner-pom.xml> to /home/jenkins/.m2/repository/org/apache/hadoop/hadoop-yarn-server-common/0.23.0-SNAPSHOT/hadoop-yarn-server-common-0.23.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building hadoop-yarn-server-nodemanager 0.23.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-yarn-server-nodemanager ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-protobuf-generated-sources-directory) @ hadoop-yarn-server-nodemanager ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/generated-sources/proto>
[INFO] Executed tasks
[INFO] 
[INFO] --- exec-maven-plugin:1.2:exec (generate-sources) @ hadoop-yarn-server-nodemanager ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.5:add-source (add-source) @ hadoop-yarn-server-nodemanager ---
[INFO] Source directory: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/generated-sources/proto> added.
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ hadoop-yarn-server-nodemanager ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ hadoop-yarn-server-nodemanager ---
[INFO] Compiling 116 source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:testResources (default-testResources) @ hadoop-yarn-server-nodemanager ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/src/test/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) @ hadoop-yarn-server-nodemanager ---
[INFO] Compiling 31 source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.7.2:test (default-test) @ hadoop-yarn-server-nodemanager ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-yarn-server-nodemanager ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target/hadoop-yarn-server-nodemanager-0.23.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (santize-pom) @ hadoop-yarn-server-nodemanager ---
[INFO] Executing tasks

main:
     [echo] project.build.directory: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/target>
[INFO] Executed tasks
[INFO] 
[INFO] --- make-maven-plugin:1.0-beta-1:autoreconf (autoreconf) @ hadoop-yarn-server-nodemanager ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.588s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1.181s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.173s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.076s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [2.000s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.932s]
[INFO] Apache Hadoop Common .............................. SUCCESS [25.045s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.003s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [16.286s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.004s]
[INFO] hadoop-yarn-api ................................... SUCCESS [6.583s]
[INFO] hadoop-yarn-common ................................ SUCCESS [8.553s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [2.988s]
[INFO] hadoop-yarn-server-nodemanager .................... FAILURE [5.203s]
[INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
[INFO] hadoop-yarn-server-tests .......................... SKIPPED
[INFO] hadoop-yarn-server ................................ SKIPPED
[INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
[INFO] hadoop-yarn-applications .......................... SKIPPED
[INFO] hadoop-yarn-site .................................. SKIPPED
[INFO] hadoop-yarn ....................................... SKIPPED
[INFO] hadoop-mapreduce-client-core ...................... SKIPPED
[INFO] hadoop-mapreduce-client-common .................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
[INFO] hadoop-mapreduce-client-app ....................... SKIPPED
[INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
[INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
[INFO] hadoop-mapreduce-client ........................... SKIPPED
[INFO] hadoop-mapreduce .................................. SKIPPED
[INFO] Apache Hadoop Main ................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:10.274s
[INFO] Finished at: Thu Oct 06 11:33:16 UTC 2011
[INFO] Final Memory: 109M/754M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:make-maven-plugin:1.0-beta-1:autoreconf (autoreconf) on project hadoop-yarn-server-nodemanager: autoreconf command returned an exit value != 0. Aborting build; see debug output for more information. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-yarn-server-nodemanager
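(The resume hint above leaves the goals unspecified. As a rough local reproduction, one might resume the reactor at the failing module with debug output enabled so the actual autoreconf invocation and its stderr become visible; the goals, profiles, and trunk checkout path below are assumptions, not something this log states.)

    # Hypothetical resume of the failing reactor module with full debug output (-X);
    # "clean install -DskipTests -Pnative" are guessed goals/profiles, not taken from this log.
    cd trunk
    /home/jenkins/tools/maven/latest/bin/mvn clean install -DskipTests -Pnative -X \
        -rf :hadoop-yarn-server-nodemanager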
+ cd hadoop-hdfs-project
+ /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle findbugs:findbugs -DskipTests -Pdist -Dtar -Psrc -Pnative -Pdocs
[INFO] Scanning for projects...
[WARNING] 
[WARNING] Some problems were encountered while building the effective model for org.apache.hadoop:hadoop-hdfs:jar:0.23.0-SNAPSHOT
[WARNING] 'build.plugins.plugin.(groupId:artifactId)' must be unique but found duplicate declaration of plugin org.apache.maven.plugins:maven-antrun-plugin @ line 302, column 15
[WARNING] 
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING] 
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING] 
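(The duplicate-plugin warning points at hadoop-hdfs's own pom.xml. A quick, hedged way to locate both maven-antrun-plugin declarations so they can be merged into a single <plugin> entry; the path is assumed from the workspace layout shown in this log, running from the Jenkins workspace root.)

    # Hypothetical check: list every maven-antrun-plugin reference in the hadoop-hdfs pom.
    grep -n "maven-antrun-plugin" trunk/hadoop-hdfs-project/hadoop-hdfs/pom.xml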
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Hadoop HDFS
[INFO] Apache Hadoop HDFS Project
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS 0.23.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target>
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (hdfs) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[INFO] Compiling 8 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
log4j:WARN No appenders could be found for logger (org.apache.jasper.JspC).
log4j:WARN Please initialize the log4j system properly.
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.274
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (secondary) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 1 JSP source file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.017
[INFO] 
[INFO] --- jspc-maven-plugin:2.0-alpha-3:compile (datanode) @ hadoop-hdfs ---
[WARNING] Compiled JSPs will not be added to the project and web.xml will not be modified, either because includeInProject is set to false or because the project's packaging is not 'war'.
[INFO] Compiling 3 JSP source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp>
WARN: The method class org.apache.commons.logging.impl.SLF4JLogFactory#release() was invoked.
WARN: Please see http://www.slf4j.org/codes.html for an explanation.
[INFO] Compiled completed in 0:00:00.023
[INFO] 
[INFO] --- build-helper-maven-plugin:1.5:add-source (add-source) @ hadoop-hdfs ---
[INFO] Source directory: <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/generated-src/main/jsp> added.
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ hadoop-hdfs ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ hadoop-hdfs ---
[INFO] Compiling 286 source files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/classes>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-web-xmls) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/hdfs/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/secondary/WEB-INF>
     [copy] Copying 1 file to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps/datanode/WEB-INF>
     [copy] Copying 6 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/webapps>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (compile) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
     [copy] Copying 15 files to <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>
     [copy] Copied 6 empty directories to 2 empty directories under <https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/native>
[INFO] Executed tasks
[INFO] 
[INFO] --- make-maven-plugin:1.0-beta-1:autoreconf (compile) @ hadoop-hdfs ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [10.182s]
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10.633s
[INFO] Finished at: Thu Oct 06 11:33:28 UTC 2011
[INFO] Final Memory: 25M/265M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:make-maven-plugin:1.0-beta-1:autoreconf (compile) on project hadoop-hdfs: autoreconf command returned an exit value != 0. Aborting build; see debug output for more information. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
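(Both aborts come from the same make-maven-plugin autoreconf goal, first in hadoop-yarn-server-nodemanager and now in hadoop-hdfs, which points at the native-build toolchain on the slave rather than at the Java sources. A hedged sanity check one might run on the build host; nothing in this log confirms which tool is actually missing or outdated.)

    # Hypothetical diagnostic: confirm the GNU autotools that autoreconf drives are installed and on PATH.
    for tool in autoconf automake libtool autoreconf; do
        command -v "$tool" >/dev/null && "$tool" --version | head -1 || echo "$tool: not found"
    done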
+ /home/jenkins/tools/maven/latest/bin/mvn test -Dmaven.test.failure.ignore=true -Pclover -DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Archiving artifacts
Publishing Clover coverage report...
Publishing Clover HTML report...
Publishing Clover XML report...
Publishing Clover coverage results...
Recording test results
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Publishing Javadoc
ERROR: Publisher hudson.tasks.JavadocArchiver aborted due to exception
<https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api> does not exist.
        at org.apache.tools.ant.types.AbstractFileSet.getDirectoryScanner(AbstractFileSet.java:474)
        at hudson.FilePath$34.hasMatch(FilePath.java:1801)
        at hudson.FilePath$34.invoke(FilePath.java:1710)
        at hudson.FilePath$34.invoke(FilePath.java:1701)
        at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1995)
        at hudson.remoting.UserRequest.perform(UserRequest.java:118)
        at hudson.remoting.UserRequest.perform(UserRequest.java:48)
        at hudson.remoting.Request$2.run(Request.java:287)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
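(The JavadocArchiver most likely aborts because target/site/api was never generated: the hadoop-hdfs run above failed at autoreconf before any -Pdocs site goals could execute. A trivial, hedged check on the workspace, with the path assumed from the exception above and run from the Jenkins workspace root.)

    # Hypothetical check: the Javadoc publisher expects this directory to exist after a -Pdocs build.
    ls -d trunk/hadoop-hdfs-project/hadoop-hdfs/target/site/api 2>/dev/null || echo "javadoc output not generated"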
Recording fingerprints
Updating MAPREDUCE-2702
Updating MAPREDUCE-2913
Updating MAPREDUCE-2907
Updating MAPREDUCE-3013
Updating MAPREDUCE-2738
