See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/284/changes>

Changes:

[xyao] HADOOP-12347. Fix mismatch parameter name in javadocs of 
AuthToken#setMaxInactives. Contributed by Xiaoyu Yao

------------------------------------------
[...truncated 7146 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHASafeMode
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.904 sec - 
in org.apache.hadoop.hdfs.server.namenode.ha.TestHASafeMode
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencingWithReplication
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 69.367 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencingWithReplication
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestDFSUpgradeWithHA
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.803 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestDFSUpgradeWithHA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestEditLogsDuringFailover
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.916 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestEditLogsDuringFailover
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestEditLogTailer
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.153 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestEditLogTailer
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestPendingCorruptDnMessages
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.335 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestPendingCorruptDnMessages
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestPipelinesFailover
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 101.982 sec - 
in org.apache.hadoop.hdfs.server.namenode.ha.TestPipelinesFailover
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestQuotasWithHA
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.987 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestQuotasWithHA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestGetGroupsWithHA
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.431 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestGetGroupsWithHA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestNNHealthCheck
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.959 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestNNHealthCheck
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestFailureToReadEdits
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 87.497 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestFailureToReadEdits
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestFailureOfSharedDir
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.181 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestFailureOfSharedDir
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running 
org.apache.hadoop.hdfs.server.namenode.ha.TestLossyRetryInvocationHandler
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.116 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestLossyRetryInvocationHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHAFsck
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.865 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestHAFsck
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestBootstrapStandby
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.045 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestBootstrapStandby
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.47 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestDNFencing
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHAAppend
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.698 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestHAAppend
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyIsHot
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.126 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyIsHot
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestRetryCacheWithHA
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 99.63 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestRetryCacheWithHA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHAStateTransitions
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 65.948 sec - 
in org.apache.hadoop.hdfs.server.namenode.ha.TestHAStateTransitions
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestBootstrapStandbyWithQJM
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.315 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestBootstrapStandbyWithQJM
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestRemoteNameNodeInfo
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.625 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestRemoteNameNodeInfo
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyCheckpoints
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 172.62 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyCheckpoints
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHAConfiguration
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.311 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestHAConfiguration
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHAMetrics
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.968 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestHAMetrics
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running 
org.apache.hadoop.hdfs.server.namenode.ha.TestRequestHedgingProxyProvider
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.08 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestRequestHedgingProxyProvider
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestStateTransitionFailure
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.153 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestStateTransitionFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyBlockManagement
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.831 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestStandbyBlockManagement
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestDelegationTokensWithHA
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.106 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestDelegationTokensWithHA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running 
org.apache.hadoop.hdfs.server.namenode.ha.TestFailoverWithBlockTokensEnabled
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.846 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestFailoverWithBlockTokensEnabled
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes
Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 133.953 sec <<< 
FAILURE! - in org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes
testCircularLinkedListWrites(org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes)
  Time elapsed: 133.765 sec  <<< FAILURE!
java.lang.AssertionError: Some writers didn't complete in expected runtime! 
Current writer state:[Circular Writer:
         directory: /test-0
         target length: 50
         current item: 48
         done: false
, Circular Writer:
         directory: /test-1
         target length: 50
         current item: 27
         done: false
] expected:<0> but was:<2>
        at org.junit.Assert.fail(Assert.java:88)
        at org.junit.Assert.failNotEquals(Assert.java:743)
        at org.junit.Assert.assertEquals(Assert.java:118)
        at org.junit.Assert.assertEquals(Assert.java:555)
        at 
org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes.testCircularLinkedListWrites(TestSeveralNameNodes.java:90)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestHarFileSystemWithHA
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.199 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestHarFileSystemWithHA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestXAttrsWithHA
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.75 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestXAttrsWithHA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.ha.TestInitializeSharedEdits
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.091 sec - in 
org.apache.hadoop.hdfs.server.namenode.ha.TestInitializeSharedEdits
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.TestDefaultBlockPlacementPolicy
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.581 sec - in 
org.apache.hadoop.hdfs.server.namenode.TestDefaultBlockPlacementPolicy
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.TestDecommissioningStatus
Tests run: 3, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 13.295 sec <<< 
FAILURE! - in org.apache.hadoop.hdfs.server.namenode.TestDecommissioningStatus
testDecommissionStatus(org.apache.hadoop.hdfs.server.namenode.TestDecommissioningStatus)
  Time elapsed: 0.111 sec  <<< FAILURE!
java.lang.AssertionError: Unexpected num under-replicated blocks expected:<3> 
but was:<4>
        at org.junit.Assert.fail(Assert.java:88)
        at org.junit.Assert.failNotEquals(Assert.java:743)
        at org.junit.Assert.assertEquals(Assert.java:118)
        at org.junit.Assert.assertEquals(Assert.java:555)
        at 
org.apache.hadoop.hdfs.server.namenode.TestDecommissioningStatus.checkDecommissionStatus(TestDecommissioningStatus.java:196)
        at 
org.apache.hadoop.hdfs.server.namenode.TestDecommissioningStatus.testDecommissionStatus(TestDecommissioningStatus.java:291)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.TestINodeAttributeProvider
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.097 sec - in 
org.apache.hadoop.hdfs.server.namenode.TestINodeAttributeProvider
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; 
support was removed in 8.0
Running org.apache.hadoop.hdfs.server.namenode.metrics.TestNameNodeMetrics

Results :

Failed tests: 
  
TestParallelUnixDomainRead>TestParallelReadUtil.testParallelNoChecksums:421->TestParallelReadUtil.runTestWorkload:382
 Check log for errors
  
TestParallelUnixDomainRead>TestParallelReadUtil.testParallelReadCopying:405->TestParallelReadUtil.runTestWorkload:382
 Check log for errors
  TestCacheDirectives.testExceedsCapacity:1502->checkPendingCachedEmpty:1479 
Pending cached list of 127.0.0.1:51581 is not empty, [{blockId=1073741841, 
replication=1, mark=false}]
  TestSeveralNameNodes.testCircularLinkedListWrites:90 Some writers didn't 
complete in expected runtime! Current writer state:[Circular Writer:
         directory: /test-0
         target length: 50
         current item: 48
         done: false
, Circular Writer:
         directory: /test-1
         target length: 50
         current item: 27
         done: false
] expected:<0> but was:<2>
  
TestDecommissioningStatus.testDecommissionStatus:291->checkDecommissionStatus:196
 Unexpected num under-replicated blocks expected:<3> but was:<4>

Tests in error: 
  
TestParallelUnixDomainRead.teardownCluster:58->TestParallelReadUtil.teardownCluster:393
 » NoClassDefFound

Tests run: 2488, Failures: 5, Errors: 1, Skipped: 12

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project 
---
[INFO] Deleting 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project 
---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ 
hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable 
package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project 
---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ 
hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ 
hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [03:01 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  01:40 h]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.307 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:43 h
[INFO] Finished at: 2015-08-22T01:33:28+00:00
[INFO] Final Memory: 75M/1005M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on 
project hadoop-hdfs: ExecutionException: java.lang.RuntimeException: The forked 
VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs>
 && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx4096m 
-XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3977008060939306852.jar>
 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire461518326222278232tmp>
 
<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_1513677381281568530447tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Sending artifact delta relative to Hadoop-Hdfs-trunk-Java8 #222
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 4390367 bytes
Compression is 0.0%
Took 4 sec
Recording test results
Updating HADOOP-12347