See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1477/changes>
Changes:

[brandonli] HDFS-5043. For HdfsFileStatus, set default value of childrenNum to -1 instead of 0 to avoid confusing applications. Contributed by Brandon Li
[vinodkv] YARN-966. Fixed ContainerLaunch to not fail quietly when there are no localized resources due to some other failure. Contributed by Zhijie Shen.
[vinodkv] YARN-948. Changed ResourceManager to validate the release container list before actually releasing them. Contributed by Omkar Vinit Joshi.
[vinodkv] MAPREDUCE-5385. Fixed a bug with JobContext getCacheFiles API. Contributed by Omkar Vinit Joshi.
[cnauroth] HADOOP-9768. Moving from 2.1.0-beta to 2.1.1-beta in CHANGES.txt, because this patch did not make it into the 2.1.0-beta RC.
[acmurthy] Updating releasenotes for hadoop-2.1.0-beta.
[acmurthy] Updating release date for hadoop-2.1.0-beta.
[acmurthy] Moved HADOOP-9509 & HADOOP-9515 to appropriate release of 2.1.0-beta.

------------------------------------------
[...truncated 15221 lines...]
[INFO] Excluding commons-collections:commons-collections:jar:3.2.1 from the shaded jar.
[INFO] Excluding commons-digester:commons-digester:jar:1.8 from the shaded jar.
[INFO] Excluding commons-beanutils:commons-beanutils:jar:1.7.0 from the shaded jar.
[INFO] Excluding commons-beanutils:commons-beanutils-core:jar:1.8.0 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-api:jar:1.6.1 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-log4j12:jar:1.6.1 from the shaded jar.
[INFO] Including org.apache.bookkeeper:bookkeeper-server:jar:4.0.0 in the shaded jar.
[INFO] Including org.jboss.netty:netty:jar:3.2.4.Final in the shaded jar.
[INFO] Including org.apache.zookeeper:zookeeper:jar:3.4.2 in the shaded jar.
[INFO] Excluding jline:jline:jar:0.9.94 from the shaded jar.
[INFO] Excluding com.google.guava:guava:jar:11.0.2 from the shaded jar.
[INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/hadoop-hdfs-bkjournal-3.0.0-SNAPSHOT.jar> with <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/hadoop-hdfs-bkjournal-3.0.0-SNAPSHOT-shaded.jar>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-bkjournal ---
[INFO] 
[INFO] There are 321 checkstyle errors.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-bkjournal ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is true
[INFO] ****** FindBugsMojo executeFindbugs *******
[INFO] Temp File is <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib/bkjournal/target/findbugsTemp.xml>
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS-NFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-nfs ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-dir>
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test/data>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 12 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-hdfs-nfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-hdfs-nfs ---
[INFO] Compiling 7 source files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.12.3:test (default-test) @ hadoop-hdfs-nfs ---
[INFO] Surefire report directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.hdfs.nfs.nfs3.TestOffsetRange
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.057 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestRpcProgramNfs3
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.hdfs.nfs.nfs3.TestDFSClientCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.324 sec
Running org.apache.hadoop.hdfs.nfs.TestMountd
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.307 sec

Results :

Tests run: 8, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (prepare-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:test-jar (prepare-test-jar) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-tests.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-sources.jar>
[INFO] 
[INFO] >>> maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar (default) @ hadoop-hdfs-nfs ---
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-test-sources.jar>
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default) @ hadoop-hdfs-nfs ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is true
[INFO] ****** FindBugsMojo executeFindbugs *******
[INFO] Temp File is <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/findbugsTemp.xml>
[INFO] Fork Value is true
[INFO] xmlOutput is false
[INFO] 
[INFO] --- maven-dependency-plugin:2.1:copy (site) @ hadoop-hdfs-nfs ---
[INFO] Configured Artifact: jdiff:jdiff:1.0.9:jar
[INFO] Configured Artifact: org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT:jar
[INFO] Copying jdiff-1.0.9.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/jdiff.jar>
[INFO] Copying hadoop-annotations-3.0.0-SNAPSHOT.jar to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-annotations.jar>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (site) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (pre-dist) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] <<< maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs <<<
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs-nfs ---
[INFO] ExcludePrivateAnnotationsStandardDoclet
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (dist) @ hadoop-hdfs-nfs ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'
[INFO] Copying files to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-nfs ---
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs-nfs:java-source:test-sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs-nfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-nfs ---
[INFO] ExcludePrivateAnnotationsStandardDoclet
[INFO] Building jar: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-nfs ---
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:40:36.365s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [2:29.304s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [1:20.708s]
[INFO] Apache Hadoop HDFS-NFS ............................ FAILURE [25.741s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.031s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:53.003s
[INFO] Finished at: Wed Jul 31 13:18:35 UTC 2013
[INFO] Final Memory: 49M/796M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-checkstyle-plugin:2.6:checkstyle (default-cli) on project hadoop-hdfs-nfs: An error has occurred in Checkstyle report generation. Failed during checkstyle execution: Unable to find configuration file at location file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>: Could not find resource 'file://<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-nfs/dev-support/checkstyle.xml>'. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-nfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HDFS-5043
Updating YARN-966
Updating HADOOP-9768
Updating HADOOP-9515
Updating MAPREDUCE-5385
Updating YARN-948
Updating HADOOP-9509
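
For context on the HDFS-5043 entry in the change list above (default childrenNum of -1 rather than 0), here is a minimal, self-contained Java sketch of the convention that change describes. The class and accessor below are hypothetical stand-ins, not the actual org.apache.hadoop.hdfs.protocol.HdfsFileStatus API; the point is only that -1 signals "children count not reported", so a value of 0 unambiguously means an empty directory.

// Hypothetical illustration of the HDFS-5043 childrenNum convention.
// Not the real HdfsFileStatus class; names here are stand-ins.
public class ChildrenNumSketch {

  // Sentinel meaning the server did not report a children count (new default per HDFS-5043).
  static final int CHILDREN_NUM_UNKNOWN = -1;

  static String describe(int childrenNum) {
    if (childrenNum == CHILDREN_NUM_UNKNOWN) {
      return "children count not reported";   // previously indistinguishable from "empty"
    }
    return childrenNum + " child(ren)";       // 0 now genuinely means an empty directory
  }

  public static void main(String[] args) {
    System.out.println(describe(CHILDREN_NUM_UNKNOWN)); // children count not reported
    System.out.println(describe(0));                    // 0 child(ren)
    System.out.println(describe(3));                    // 3 child(ren)
  }
}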