See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1131/changes>

Changes:

[eli] HDFS-3758. TestFuseDFS test failing. Contributed by Colin Patrick McCabe

[tucu] HADOOP-8681. add support for HTTPS to the web UIs. (tucu)

[todd] HDFS-3695. Genericize format() to non-file JournalManagers. Contributed by Todd Lipcon.

[todd] HADOOP-8659. Native libraries must build with soft-float ABI for Oracle JVM on ARM. Contributed by Trevor Robinson.

[atm] HDFS-3721. hsync support broke wire compatibility. Contributed by Todd Lipcon and Aaron T. Myers.

[bobby] YARN-14. Symlinks to peer distributed cache files no longer work (Jason Lowe via bobby)

[bobby] MAPREDUCE-3782. teragen terasort jobs fail when using webhdfs:// (Jason Lowe via bobby)

[atm] HDFS-3634. Add self-contained, mavenized fuse_dfs test. Contributed by Colin Patrick McCabe.

------------------------------------------
[...truncated 12667 lines...]
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-hdfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
2 warnings
[WARNING] Javadoc Warnings
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:32: warning: com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] import com.sun.org.apache.xml.internal.serialize.OutputFormat;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:33: warning: com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[WARNING] import com.sun.org.apache.xml.internal.serialize.XMLSerializer;
[WARNING] ^
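
The two warnings above come from importing JDK-internal serializer classes, which javac flags because they can disappear in any JDK release. A portable alternative is the standard javax.xml.transform API; the sketch below is a minimal illustration only (the class name and the element written are invented, and this is not the fix that actually landed in XmlEditsVisitor).

    import java.io.OutputStream;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;

    public class PortableXmlWriter {
      // Serialize 'doc' to 'out' as indented UTF-8 XML using only
      // JDK-standard APIs, avoiding the com.sun.org.apache.xml.internal
      // classes flagged by the warnings above.
      static void write(Document doc, OutputStream out) throws Exception {
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.INDENT, "yes");
        t.setOutputProperty(OutputKeys.ENCODING, "UTF-8");
        t.transform(new DOMSource(doc), new StreamResult(out));
      }

      public static void main(String[] args) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder().newDocument();
        doc.appendChild(doc.createElement("EDITS")); // invented root element
        write(doc, System.out);
      }
    }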
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (dist) @ hadoop-hdfs ---
[WARNING] The following patterns were never triggered in this artifact exclusion filter:
o  'org.apache.ant:*:jar'
o  'jdiff:jdiff:jar'

[INFO] Copying files to 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.0.0-SNAPSHOT>
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ 
hadoop-hdfs ---
[INFO] org already added, skipping
[INFO] org/apache already added, skipping
[INFO] org/apache/hadoop already added, skipping
[INFO] org/apache/hadoop/hdfs already added, skipping
[INFO] org/apache/hadoop/hdfs/server already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode/ha already added, skipping
[INFO] org/apache/hadoop/hdfs/protocol already added, skipping
[INFO] org/apache/hadoop/hdfs/protocol/proto already added, skipping
[INFO] org already added, skipping
[INFO] org/apache already added, skipping
[INFO] org/apache/hadoop already added, skipping
[INFO] org/apache/hadoop/hdfs already added, skipping
[INFO] org/apache/hadoop/hdfs/server already added, skipping
[INFO] org/apache/hadoop/hdfs/server/datanode already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode already added, skipping
[INFO] Building jar: 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.0.0-SNAPSHOT-sources.jar>
[INFO] org already added, skipping
[INFO] org/apache already added, skipping
[INFO] org/apache/hadoop already added, skipping
[INFO] org/apache/hadoop/hdfs already added, skipping
[INFO] org/apache/hadoop/hdfs/server already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode/ha already added, skipping
[INFO] org/apache/hadoop/hdfs/protocol already added, skipping
[INFO] org/apache/hadoop/hdfs/protocol/proto already added, skipping
[INFO] org already added, skipping
[INFO] org/apache already added, skipping
[INFO] org/apache/hadoop already added, skipping
[INFO] org/apache/hadoop/hdfs already added, skipping
[INFO] org/apache/hadoop/hdfs/server already added, skipping
[INFO] org/apache/hadoop/hdfs/server/datanode already added, skipping
[INFO] org/apache/hadoop/hdfs/server/namenode already added, skipping
[WARNING] Artifact org.apache.hadoop:hadoop-hdfs:java-source:sources:3.0.0-SNAPSHOT already attached to project, ignoring duplicate
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ hadoop-hdfs ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ 
hadoop-hdfs ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-hdfs ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs ---
[INFO] 
ExcludePrivateAnnotationsStandardDoclet
2 warnings
[WARNING] Javadoc Warnings
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:32: warning: com.sun.org.apache.xml.internal.serialize.OutputFormat is Sun proprietary API and may be removed in a future release
[WARNING] import com.sun.org.apache.xml.internal.serialize.OutputFormat;
[WARNING] ^
[WARNING] <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java>:33: warning: com.sun.org.apache.xml.internal.serialize.XMLSerializer is Sun proprietary API and may be removed in a future release
[WARNING] import com.sun.org.apache.xml.internal.serialize.XMLSerializer;
[WARNING] ^
[INFO] Building jar: 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-3.0.0-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs 
---
[INFO] 
[INFO] There are 7467 checkstyle errors.
[WARNING] Unable to locate Source XRef to link to - DISABLED
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is true
[INFO] ****** FindBugsMojo executeFindbugs *******
[INFO] Temp File is 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/findbugsTemp.xml>
[INFO] Fork Value is true
     [java] Warnings generated: 2
[INFO] xmlOutput is false
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HttpFS 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-httpfs 
---
[INFO] Deleting 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-httpfs 
---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath) @ 
hadoop-hdfs-httpfs ---
[INFO] Wrote classpath file '<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/classes/mrapp-generated-classpath>'.
[INFO] 
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ 
hadoop-hdfs-httpfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ 
hadoop-hdfs-httpfs ---
[INFO] Compiling 56 source files to 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/classes>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-web-xmls) @ hadoop-hdfs-httpfs 
---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/test-classes/webapp>
     [copy] Copying 1 file to 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/test-classes/webapp>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ 
hadoop-hdfs-httpfs ---
[INFO] Using default encoding to copy filtered resources.
[INFO] 
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ 
hadoop-hdfs-httpfs ---
[INFO] Compiling 46 source files to 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/test-classes>
[INFO] 
[INFO] --- maven-surefire-plugin:2.12:test (default-test) @ hadoop-hdfs-httpfs 
---
[INFO] Surefire report directory: 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire-reports>

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Running org.apache.hadoop.test.TestDirHelper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.049 sec
Running org.apache.hadoop.test.TestJettyHelper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.054 sec
Running org.apache.hadoop.test.TestHdfsHelper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.06 sec
Running org.apache.hadoop.test.TestHTestCase
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.201 sec
Running org.apache.hadoop.test.TestExceptionHelper
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.046 sec
Running org.apache.hadoop.test.TestHFSTestCase
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.883 sec
Running org.apache.hadoop.lib.service.instrumentation.TestInstrumentationService
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.66 sec
Running org.apache.hadoop.lib.service.scheduler.TestSchedulerService
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.212 sec
Running org.apache.hadoop.lib.service.security.TestProxyUserService
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.896 sec
Running org.apache.hadoop.lib.service.security.TestDelegationTokenManagerService
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.722 sec
Running org.apache.hadoop.lib.service.security.TestGroupsService
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.227 sec
Running org.apache.hadoop.lib.service.hadoop.TestFileSystemAccessService
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.082 sec
Running org.apache.hadoop.lib.server.TestServerConstructor
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.114 sec
Running org.apache.hadoop.lib.server.TestServer
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.486 sec
Running org.apache.hadoop.lib.server.TestBaseService
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.269 sec
Running org.apache.hadoop.lib.lang.TestRunnableCallable
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.lib.lang.TestXException
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec
Running org.apache.hadoop.lib.wsrs.TestParam
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.072 sec
Running org.apache.hadoop.lib.wsrs.TestInputStreamEntity
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.051 sec
Running org.apache.hadoop.lib.wsrs.TestJSONProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.051 sec
Running org.apache.hadoop.lib.wsrs.TestJSONMapProvider
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.052 sec
Running org.apache.hadoop.lib.wsrs.TestUserProvider
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.254 sec
Running org.apache.hadoop.lib.servlet.TestServerWebApp
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.21 sec
Running org.apache.hadoop.lib.servlet.TestMDCFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.244 sec
Running org.apache.hadoop.lib.servlet.TestHostnameFilter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.226 sec
Running org.apache.hadoop.lib.util.TestCheck
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.087 sec
Running org.apache.hadoop.lib.util.TestConfigurationUtils
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.127 sec
Running org.apache.hadoop.fs.http.server.TestHttpFSServer
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.378 sec
Running org.apache.hadoop.fs.http.server.TestHttpFSKerberosAuthenticationHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.049 sec
Running org.apache.hadoop.fs.http.server.TestCheckUploadContentTypeFilter
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.27 sec
Running org.apache.hadoop.fs.http.client.TestHttpFSFWithWebhdfsFileSystem
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.764 sec
Running org.apache.hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem
Tests run: 30, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 19.145 sec <<< FAILURE!
Running org.apache.hadoop.fs.http.client.TestHttpFSFileSystemLocalFileSystem
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.473 sec

Results :

Tests in error:
  testOperation[0](org.apache.hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem): Address already in use

Tests run: 269, Failures: 0, Errors: 1, Skipped: 0
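
The lone error above is a port collision: the test tried to bind a port that another process (or an earlier server in the same run) still held. A common mitigation is to ask the OS for a free ephemeral port instead of hard-coding one; the helper below is a generic sketch under that assumption (the class and method names are invented, and this is not the TestJettyHelper code itself).

    import java.net.ServerSocket;

    public class FreePortHelper {
      // Ask the kernel for an unused ephemeral port by binding port 0,
      // then release it so the server under test can bind it. There is a
      // small race between close() and the re-bind, so real test helpers
      // often retry on failure.
      static int findFreePort() throws Exception {
        try (ServerSocket s = new ServerSocket(0)) {
          s.setReuseAddress(true);
          return s.getLocalPort();
        }
      }

      public static void main(String[] args) throws Exception {
        System.out.println("free port: " + findFreePort());
      }
    }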

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:36:24.736s]
[INFO] Apache Hadoop HttpFS .............................. FAILURE [1:43.379s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:38:08.885s
[INFO] Finished at: Fri Aug 10 13:11:23 UTC 2012
[INFO] Final Memory: 49M/749M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.12:test (default-test) on project hadoop-hdfs-httpfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-httpfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Error updating JIRA issues. Saving issues for next build.
JIRA is currently being reindexed. Depending on how large the database is, this may take a few minutes. JIRA will automatically become available as soon as this task is complete.
