See <https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/467/changes>

Changes:

[hairong] HDFS-1482. Add listCorruptFileBlocks to DistributedFileSystem. 
Contributed by Patrick Kling.
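
For context, a rough sketch of how the API added by HDFS-1482 might be
exercised from client code. The listCorruptFileBlocks name, its String path
argument, and the CorruptFileBlocks return type are taken from the compiler
output below; the cookie parameter and the getFiles() accessor are
assumptions for illustration, not confirmed by this log.

    // Hypothetical usage of the HDFS-1482 API (names partly assumed, see above).
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.CorruptFileBlocks;      // new Common-side class
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    public class ListCorruptBlocksExample {
      public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        if (fs instanceof DistributedFileSystem) {
          DistributedFileSystem dfs = (DistributedFileSystem) fs;
          String cookie = null;                          // assumed paging cookie
          CorruptFileBlocks corrupt = dfs.listCorruptFileBlocks("/", cookie);
          for (String file : corrupt.getFiles()) {       // assumed accessor
            System.out.println("corrupt file: " + file);
          }
        }
      }
    }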

------------------------------------------
[...truncated 901 lines...]
A         bin/hdfs-config.sh
AU        bin/start-dfs.sh
AU        bin/stop-balancer.sh
AU        bin/hdfs
A         bin/stop-secure-dns.sh
AU        bin/stop-dfs.sh
AU        bin/start-balancer.sh
A         bin/start-secure-dns.sh
AU        build.xml
 U        .
Fetching 'https://svn.apache.org/repos/asf/hadoop/common/trunk/src/test/bin' at -1 into '<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/test/bin>'
AU        src/test/bin/test-patch.sh
At revision 1038227
At revision 1038226
Checking out http://svn.apache.org/repos/asf/hadoop/nightly
A         commitBuild.sh
A         hudsonEnv.sh
AU        hudsonBuildHadoopNightly.sh
AU        hudsonBuildHadoopPatch.sh
AU        hudsonBuildHadoopRelease.sh
AU        processHadoopPatchEmailRemote.sh
AU        hudsonPatchQueueAdmin.sh
AU        processHadoopPatchEmail.sh
A         README.txt
A         test-patch
A         test-patch/test-patch.sh
At revision 1038227
no change for http://svn.apache.org/repos/asf/hadoop/nightly since the previous 
build
no change for https://svn.apache.org/repos/asf/hadoop/common/trunk/src/test/bin 
since the previous build
[Hadoop-Hdfs-trunk-Commit] $ /bin/bash /tmp/hudson873403136010013923.sh


======================================================================
======================================================================
CLEAN: cleaning workspace
======================================================================
======================================================================


Buildfile: build.xml

clean-contrib:

clean:

check-libhdfs-fuse:

clean:
Trying to override old definition of task macro_tar

clean:
     [echo] contrib: hdfsproxy

clean:
     [echo] contrib: thriftfs

clean-fi:

clean-sign:

clean:

BUILD SUCCESSFUL
Total time: 0 seconds


======================================================================
======================================================================
BUILD: ant veryclean mvn-deploy tar findbugs -Dtest.junit.output.format=xml 
-Dtest.output=yes -Dcompile.c++=true -Dcompile.native=true 
-Dfindbugs.home=$FINDBUGS_HOME -Djava5.home=$JAVA5_HOME 
-Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME 
-Declipse.home=$ECLIPSE_HOME
======================================================================
======================================================================


Buildfile: build.xml

clean-contrib:

clean:

check-libhdfs-fuse:

clean:
Trying to override old definition of task macro_tar

clean:
     [echo] contrib: hdfsproxy

clean:
     [echo] contrib: thriftfs

clean-fi:

clean-sign:

clean:

clean-cache:
   [delete] Deleting directory /homes/hudson/.ivy2/cache/org.apache.hadoop

veryclean:

ant-task-download:
      [get] Getting: 
http://repo2.maven.org/maven2/org/apache/maven/maven-ant-tasks/2.0.10/maven-ant-tasks-2.0.10.jar
      [get] To: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/ivy/maven-ant-tasks-2.0.10.jar>

mvn-taskdef:

clover.setup:

clover.info:

clover:

ivy-download:
      [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/ivy/ivy-2.1.0.jar>

ivy-init-dirs:
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/ivy>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/ivy/lib>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/ivy/report>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/ivy/maven>

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] downloading 
https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/hadoop-common/0.22.0-SNAPSHOT/hadoop-common-0.22.0-20101119.063222-143.jar
 ...
[ivy:resolve] 
......................................................................................................................................................................................................
 (1339kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] 
org.apache.hadoop#hadoop-common;0.22.0-SNAPSHOT!hadoop-common.jar (551ms)
[ivy:resolve] downloading 
http://repo1.maven.org/maven2/org/apache/hadoop/avro/1.3.2/avro-1.3.2.jar ...
[ivy:resolve] 
...................................................................................................................................................................................................................................
 (331kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.apache.hadoop#avro;1.3.2!avro.jar (1508ms)

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 
'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/ivy/ivysettings.xml>

init:
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/classes>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/src>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/webapps/hdfs/WEB-INF>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/webapps/datanode/WEB-INF>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/webapps/secondary/WEB-INF>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/ant>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/c++>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/test>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/test/hdfs/classes>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/test/extraconf>
    [touch] Creating /tmp/null2077227495
   [delete] Deleting: /tmp/null2077227495
     [copy] Copying 3 files to 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/webapps>
     [copy] Copying 1 file to 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/conf>
     [copy] Copying 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/conf/hdfs-site.xml.template>
 to 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/conf/hdfs-site.xml>

compile-hdfs-classes:
    [javac] Compiling 207 source files to 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build/classes>
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/DFSClient.java>:64:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: package org.apache.hadoop.fs
    [javac] import org.apache.hadoop.fs.CorruptFileBlocks;
    [javac]                            ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/fs/Hdfs.java>:310:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public CorruptFileBlocks listCorruptFileBlocks(String path,
    [javac]          ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/protocol/ClientProtocol.java>:35:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: package org.apache.hadoop.fs
    [javac] import org.apache.hadoop.fs.CorruptFileBlocks;
    [javac]                            ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/DFSClient.java>:1124:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: class org.apache.hadoop.hdfs.DFSClient
    [javac]   public CorruptFileBlocks listCorruptFileBlocks(String path,
    [javac]          ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/protocol/ClientProtocol.java>:671:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: interface org.apache.hadoop.hdfs.protocol.ClientProtocol
    [javac]   public CorruptFileBlocks
    [javac]          ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/DistributedFileSystem.java>:46:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: package org.apache.hadoop.fs
    [javac] import org.apache.hadoop.fs.CorruptFileBlocks;
    [javac]                            ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/DistributedFileSystem.java>:609:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: class org.apache.hadoop.hdfs.DistributedFileSystem
    [javac]   public CorruptFileBlocks listCorruptFileBlocks(String path,
    [javac]          ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/NameNode.java>:46:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: package org.apache.hadoop.fs
    [javac] import org.apache.hadoop.fs.CorruptFileBlocks;
    [javac]                            ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/NameNode.java>:1132:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: class org.apache.hadoop.hdfs.server.namenode.NameNode
    [javac]   public CorruptFileBlocks
    [javac]          ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/fs/Hdfs.java>:309:
 method does not override or implement a method from a supertype
    [javac]   @Override
    [javac]   ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/DistributedFileSystem.java>:608:
 method does not override or implement a method from a supertype
    [javac]   @Override
    [javac]   ^
    [javac] 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/NameNode.java>:1145:
 cannot find symbol
    [javac] symbol  : class CorruptFileBlocks
    [javac] location: class org.apache.hadoop.hdfs.server.namenode.NameNode
    [javac]     return new CorruptFileBlocks(files, lastCookie);
    [javac]                ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 12 errors

BUILD FAILED
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk-Commit/ws/trunk/build.xml>:336:
 Compile failed; see the compiler error output for details.

Total time: 14 seconds
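
Every one of the 12 errors above is a missing org.apache.hadoop.fs.CorruptFileBlocks,
presumably because the hadoop-common 0.22.0-SNAPSHOT jar resolved by Ivy earlier
in this log predates the corresponding Common-side change. For illustration only,
here is a minimal sketch of the holder class the HDFS code appears to expect; only
the two-argument constructor is visible in the javac output
("new CorruptFileBlocks(files, lastCookie)"), so the field types and accessor
names are assumptions.

    // Hypothetical shape of the missing Common-side class (not from this log).
    package org.apache.hadoop.fs;

    public class CorruptFileBlocks {
      private final String[] files;   // files with corrupt blocks (assumed type)
      private final String cookie;    // opaque paging cookie (assumed type)

      public CorruptFileBlocks(String[] files, String cookie) {
        this.files = files;
        this.cookie = cookie;
      }

      public String[] getFiles() { return files;  }   // assumed accessor
      public String getCookie()  { return cookie; }   // assumed accessor
    }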


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
