See https://builds.apache.org/hudson/job/Hadoop-Hdfs-22-branch/55/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 3083 lines...]
    [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/cache

run-test-hdfs-excluding-commit-and-smoke:
    [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/data
    [mkdir] Created dir: /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/logs
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
     [copy] Copying 1 file to /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/extraconf
    [junit] WARNING: multiple versions of ant detected in path for junit 
    [junit]          jar:file:/homes/hudson/tools/ant/latest/lib/ant.jar!/org/apache/tools/ant/Project.class
    [junit]      and jar:file:/homes/hudson/.ivy2/cache/ant/ant/jars/ant-1.6.5.jar!/org/apache/tools/ant/Project.class
    [junit] Running org.apache.hadoop.fs.TestFiListPath
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 2.239 sec
    [junit] Running org.apache.hadoop.fs.TestFiRename
    [junit] Tests run: 4, Failures: 0, Errors: 3, Time elapsed: 4.457 sec
    [junit] Test org.apache.hadoop.fs.TestFiRename FAILED
    [junit] Running org.apache.hadoop.hdfs.TestFiHFlush
    [junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 16.004 sec
    [junit] Running org.apache.hadoop.hdfs.TestFiHftp
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 42.429 sec
    [junit] Running org.apache.hadoop.hdfs.TestFiPipelines
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.155 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol
    [junit] Tests run: 29, Failures: 0, Errors: 0, Time elapsed: 211.637 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiDataTransferProtocol2
    [junit] Tests run: 10, Failures: 0, Errors: 0, Time elapsed: 295.755 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestFiPipelineClose
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 35.73 sec

checkfailure:
    [touch] Creating /grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build-fi/test/testsfailed

BUILD FAILED
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:744: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:503: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/src/test/aop/build/aop.xml:230: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:685: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:659: The following error occurred while executing this line:
/grid/0/hudson/hudson-slave/workspace/Hadoop-Hdfs-22-branch/trunk/build.xml:727: Tests failed!

Total time: 92 minutes 43 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
REGRESSION:  org.apache.hadoop.fs.TestFiRename.testFailureNonExistentDst

Error Message:
Internal error: default blockSize is not a multiple of default bytesPerChecksum 

Stack Trace:
java.io.IOException: Internal error: default blockSize is not a multiple of default bytesPerChecksum
        at org.apache.hadoop.fs.AbstractFileSystem.create(AbstractFileSystem.java:474)
        at org.apache.hadoop.fs.FileContext$2.next(FileContext.java:575)
        at org.apache.hadoop.fs.FileContext$2.next(FileContext.java:572)
        at org.apache.hadoop.fs.FileContext$FSLinkResolver.resolve(FileContext.java:2196)
        at org.apache.hadoop.fs.FileContext.create(FileContext.java:572)
        at org.apache.hadoop.fs.TestFiRename.createFile(TestFiRename.java:140)
        at org.apache.hadoop.fs.TestFiRename.testFailureNonExistentDst(TestFiRename.java:151)


REGRESSION:  org.apache.hadoop.fs.TestFiRename.testFailuresExistingDst

Error Message:
Internal error: default blockSize is not a multiple of default bytesPerChecksum 

Stack Trace:
java.io.IOException: Internal error: default blockSize is not a multiple of default bytesPerChecksum
        at org.apache.hadoop.fs.AbstractFileSystem.create(AbstractFileSystem.java:474)
        at org.apache.hadoop.fs.FileContext$2.next(FileContext.java:575)
        at org.apache.hadoop.fs.FileContext$2.next(FileContext.java:572)
        at org.apache.hadoop.fs.FileContext$FSLinkResolver.resolve(FileContext.java:2196)
        at org.apache.hadoop.fs.FileContext.create(FileContext.java:572)
        at org.apache.hadoop.fs.TestFiRename.createFile(TestFiRename.java:140)
        at org.apache.hadoop.fs.TestFiRename.testFailuresExistingDst(TestFiRename.java:167)


REGRESSION:  org.apache.hadoop.fs.TestFiRename.testDeletionOfDstFile

Error Message:
Internal error: default blockSize is not a multiple of default bytesPerChecksum 

Stack Trace:
java.io.IOException: Internal error: default blockSize is not a multiple of default bytesPerChecksum
        at org.apache.hadoop.fs.AbstractFileSystem.create(AbstractFileSystem.java:474)
        at org.apache.hadoop.fs.FileContext$2.next(FileContext.java:575)
        at org.apache.hadoop.fs.FileContext$2.next(FileContext.java:572)
        at org.apache.hadoop.fs.FileContext$FSLinkResolver.resolve(FileContext.java:2196)
        at org.apache.hadoop.fs.FileContext.create(FileContext.java:572)
        at org.apache.hadoop.fs.TestFiRename.createFile(TestFiRename.java:140)
        at org.apache.hadoop.fs.TestFiRename.testDeletionOfDstFile(TestFiRename.java:188)
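All three TestFiRename errors report the same precondition failing in AbstractFileSystem.create: the configured block size must be an exact multiple of the checksum chunk size, since each block is checksummed in fixed-size chunks. The sketch below illustrates that invariant only; the class, method, and values are illustrative, not Hadoop's actual code or this build's configuration.

```java
// Illustrative sketch (not Hadoop source): the invariant behind
// "default blockSize is not a multiple of default bytesPerChecksum".
public class BlockSizeCheck {
    // Rejects any blockSize that does not divide evenly into checksum chunks.
    static void validate(long blockSize, int bytesPerChecksum) {
        if (bytesPerChecksum <= 0 || blockSize % bytesPerChecksum != 0) {
            throw new IllegalArgumentException(
                "blockSize (" + blockSize + ") is not a multiple of bytesPerChecksum ("
                + bytesPerChecksum + ")");
        }
    }

    public static void main(String[] args) {
        validate(64L * 1024 * 1024, 512);   // 64 MB blocks, 512 B chunks: divides evenly
        try {
            validate(1000, 512);            // 1000 % 512 != 0: rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

A mismatch like this usually means the test run picked up an overridden block-size or bytes-per-checksum default rather than a code bug in rename itself, which would explain all three cases failing identically in createFile.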


FAILED:  org.apache.hadoop.cli.TestHDFSCLI.testAll

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.


FAILED:  org.apache.hadoop.hdfs.server.datanode.TestBlockRecovery.testErrorReplicas

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.


