See <https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/493/changes>

Changes:

[nigel] Fix bug in tar file name

[nigel] Add some comments to commitBuild.sh and put artifacts in a single 
directory that can be cleaned up.

[eli] HDFS-1467. Append pipeline never succeeds with more than one replica. 
Contributed by Todd Lipcon

[cos] HDFS-1167. New property for local conf directory in system-test-hdfs.xml 
file. Contributed by Vinay Thota.

------------------------------------------
[...truncated 843 lines...]
A         src/c++/libhdfs/hdfsJniHelper.c
AU        src/c++/libhdfs/Makefile.am
A         src/c++/libhdfs/missing
A         src/c++/libhdfs/hdfs.h
A         src/c++/libhdfs/hdfsJniHelper.h
A         src/c++/libhdfs/aclocal.m4
A         src/c++/libhdfs/install-sh
A         src/docs
A         src/docs/forrest.properties
A         src/docs/status.xml
A         src/docs/src
A         src/docs/src/documentation
A         src/docs/src/documentation/conf
A         src/docs/src/documentation/conf/cli.xconf
A         src/docs/src/documentation/skinconf.xml
A         src/docs/src/documentation/content
A         src/docs/src/documentation/content/xdocs
A         src/docs/src/documentation/content/xdocs/SLG_user_guide.xml
A         src/docs/src/documentation/content/xdocs/hdfs_quota_admin_guide.xml
A         src/docs/src/documentation/content/xdocs/site.xml
A         src/docs/src/documentation/content/xdocs/faultinject_framework.xml
A         src/docs/src/documentation/content/xdocs/hdfsproxy.xml
A         src/docs/src/documentation/content/xdocs/index.xml
A         src/docs/src/documentation/content/xdocs/hdfs_imageviewer.xml
A         src/docs/src/documentation/content/xdocs/tabs.xml
A         src/docs/src/documentation/content/xdocs/libhdfs.xml
A         src/docs/src/documentation/content/xdocs/hdfs_permissions_guide.xml
A         src/docs/src/documentation/content/xdocs/hdfs_design.xml
A         src/docs/src/documentation/content/xdocs/hdfs_user_guide.xml
A         src/docs/src/documentation/resources
A         src/docs/src/documentation/resources/images
AU        src/docs/src/documentation/resources/images/hdfsdatanodes.odg
AU        src/docs/src/documentation/resources/images/request-identify.jpg
AU        src/docs/src/documentation/resources/images/architecture.gif
AU        src/docs/src/documentation/resources/images/hadoop-logo-big.jpg
AU        src/docs/src/documentation/resources/images/hadoop-logo.jpg
AU        src/docs/src/documentation/resources/images/core-logo.gif
AU        src/docs/src/documentation/resources/images/hdfsdatanodes.png
AU        src/docs/src/documentation/resources/images/hdfsarchitecture.gif
AU        src/docs/src/documentation/resources/images/FI-framework.gif
AU        src/docs/src/documentation/resources/images/favicon.ico
AU        src/docs/src/documentation/resources/images/hdfsarchitecture.odg
AU        src/docs/src/documentation/resources/images/FI-framework.odg
AU        src/docs/src/documentation/resources/images/hdfs-logo.jpg
AU        src/docs/src/documentation/resources/images/hdfsproxy-forward.jpg
AU        src/docs/src/documentation/resources/images/hdfsproxy-server.jpg
AU        src/docs/src/documentation/resources/images/hdfsproxy-overview.jpg
AU        src/docs/src/documentation/resources/images/hdfsarchitecture.png
AU        src/docs/src/documentation/resources/images/hdfsdatanodes.gif
A         src/docs/src/documentation/README.txt
A         src/docs/src/documentation/classes
A         src/docs/src/documentation/classes/CatalogManager.properties
A         src/docs/changes
A         src/docs/changes/ChangesFancyStyle.css
AU        src/docs/changes/changes2html.pl
A         src/docs/changes/ChangesSimpleStyle.css
A         src/docs/releasenotes.html
A         bin
A         bin/hdfs-config.sh
AU        bin/start-dfs.sh
AU        bin/stop-balancer.sh
AU        bin/hdfs
A         bin/stop-secure-dns.sh
AU        bin/stop-dfs.sh
AU        bin/start-balancer.sh
A         bin/start-secure-dns.sh
AU        build.xml
 U        .
Fetching 'https://svn.apache.org/repos/asf/hadoop/common/trunk/src/test/bin' at -1 into '<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/ws/trunk/src/test/bin>'
AU        src/test/bin/test-patch.sh
At revision 1037129
At revision 1037129
Checking out http://svn.apache.org/repos/asf/hadoop/nightly
A         commitBuild.sh
A         hudsonEnv.sh
AU        hudsonBuildHadoopNightly.sh
AU        hudsonBuildHadoopPatch.sh
AU        hudsonBuildHadoopRelease.sh
AU        processHadoopPatchEmailRemote.sh
AU        hudsonPatchQueueAdmin.sh
AU        processHadoopPatchEmail.sh
A         README.txt
A         test-patch
A         test-patch/test-patch.sh
At revision 1037129
no change for https://svn.apache.org/repos/asf/hadoop/common/trunk/src/test/bin 
since the previous build
[Hadoop-Hdfs-trunk] $ /bin/bash /tmp/hudson4036872645832535124.sh


======================================================================
======================================================================
CLEAN: cleaning workspace
======================================================================
======================================================================


Buildfile: build.xml

clean-contrib:

clean:

check-libhdfs-fuse:

clean:
Trying to override old definition of task macro_tar

clean:
     [echo] contrib: hdfsproxy

clean:
     [echo] contrib: thriftfs

clean-fi:

clean-sign:

clean:

BUILD SUCCESSFUL
Total time: 0 seconds


======================================================================
======================================================================
BUILD: ant clean clean-cache tar findbugs -Dversion=${VERSION} 
-Dtest.junit.output.format=xml -Dtest.output=no -Dcompile.c++=true 
-Dcompile.native=true -Dfindbugs.home=$FINDBUGS_HOME -Djava5.home=$JAVA5_HOME 
-Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME 
-Declipse.home=$ECLIPSE_HOME
======================================================================
======================================================================
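[Editor's note: a minimal sketch of reproducing just the step that fails below, outside Hudson. The checkout location and the decision to drop the CI-specific -D properties (findbugs.home, forrest.home, clover.home, eclipse.home, java5.home) are assumptions; only the Ivy resolution is needed to hit the error, and the target names are taken from the log output that follows.]

    # Hedged sketch: reproduce only the dependency resolution that fails below.
    # Assumes a local checkout of the HDFS trunk in ./trunk and a default Ivy cache;
    # the CI-specific -D properties are omitted on the assumption that the
    # build.xml defaults are enough for resolution.
    cd trunk
    ant clean clean-cache ivy-resolve-common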


Buildfile: build.xml

clean-contrib:

clean:

check-libhdfs-fuse:

clean:
Trying to override old definition of task macro_tar

clean:
     [echo] contrib: hdfsproxy

clean:
     [echo] contrib: thriftfs

clean-fi:

clean-sign:

clean:

clean-contrib:

clean:

check-libhdfs-fuse:

clean:
Trying to override old definition of task macro_tar

clean:
     [echo] contrib: hdfsproxy

clean:
     [echo] contrib: thriftfs

clean-fi:

clean-sign:

clean:

clean-cache:
   [delete] Deleting directory /homes/hudson/.ivy2/cache/org.apache.hadoop

clover.setup:

clover.info:

clover:

ivy-download:
      [get] Getting: 
http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/ws/trunk/ivy/ivy-2.1.0.jar>

ivy-init-dirs:
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/ws/trunk/build/ivy>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/ws/trunk/build/ivy/lib>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/ws/trunk/build/ivy/report>
    [mkdir] Created dir: 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/ws/trunk/build/ivy/maven>

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = 
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/ws/trunk/ivy/ivysettings.xml>

ivy-resolve-common:
[ivy:resolve] downloading 
http://repo1.maven.org/maven2/org/apache/hadoop/avro/1.3.2/avro-1.3.2.jar ...
[ivy:resolve] ............................................................ (331kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 
[ivy:resolve] :: problems summary ::
[ivy:resolve] :::: WARNINGS
[ivy:resolve]           [FAILED     ] org.apache.hadoop#avro;1.3.2!avro.jar: invalid sha1: expected=7b6858e308cb0aee4b565442ef05563c9f62fca1 computed=da39a3ee5e6b4b0d3255bfef95601890afd80709 (1429ms)
[ivy:resolve]           [FAILED     ] org.apache.hadoop#avro;1.3.2!avro.jar:  (0ms)
[ivy:resolve]   ==== apache-snapshot: tried
[ivy:resolve]     
https://repository.apache.org/content/repositories/snapshots/org/apache/hadoop/avro/1.3.2/avro-1.3.2.jar
[ivy:resolve]   ==== maven2: tried
[ivy:resolve]     
http://repo1.maven.org/maven2/org/apache/hadoop/avro/1.3.2/avro-1.3.2.jar
[ivy:resolve]           ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve]           ::              FAILED DOWNLOADS            ::
[ivy:resolve]           :: ^ see resolution messages for details  ^ ::
[ivy:resolve]           ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve]           :: org.apache.hadoop#avro;1.3.2!avro.jar
[ivy:resolve]           ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve] 
[ivy:resolve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS

BUILD FAILED
<https://hudson.apache.org/hudson/job/Hadoop-Hdfs-trunk/ws/trunk/build.xml>:1717: impossible to resolve dependencies:
        resolve failed - see output for details

Total time: 9 seconds
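[Editor's note: the checksum Ivy computed, da39a3ee5e6b4b0d3255bfef95601890afd80709, is the SHA-1 of zero bytes, so the avro-1.3.2.jar download from repo1.maven.org came back empty rather than merely corrupted. A hedged shell sketch for confirming that and clearing any bad cache entry follows; the avro/jars sub-path assumes Ivy's default cache layout on this slave.]

    # An empty input hashes to da39a3ee5e6b4b0d3255bfef95601890afd80709:
    printf '' | sha1sum
    # If a partial or empty artifact was cached, discard it so the next build re-downloads.
    # ~/.ivy2/cache/org.apache.hadoop is the directory the clean-cache target deletes above;
    # the avro/jars sub-path below is an assumption based on Ivy's default cache pattern.
    ls -l ~/.ivy2/cache/org.apache.hadoop/avro/jars/avro-1.3.2.jar 2>/dev/null
    rm -rf ~/.ivy2/cache/org.apache.hadoop/avro

[Editor's note: the mismatch points at a transient bad download from repo1.maven.org rather than at the HDFS build itself, so re-running the build once the repository serves the jar correctly would normally be enough.]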


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
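[Editor's note: the "mv: cannot stat" errors in the STORE step are a downstream symptom; no tarball, jars, findbugs output, or docs were produced because the build died during ivy-resolve. A hypothetical guard for commitBuild.sh (the script touched in the changes above; the variable name ARTIFACTS_DIR is an illustration, not the script's actual code) would keep a failed compile from emitting this noise:]

    # Hypothetical sketch, not the real commitBuild.sh: only move artifacts that exist.
    ARTIFACTS_DIR=${ARTIFACTS_DIR:-artifacts}
    mkdir -p "$ARTIFACTS_DIR"
    for f in build/*.tar.gz build/*.jar; do
        # unmatched globs stay literal, so the -e test skips them cleanly
        [ -e "$f" ] && mv "$f" "$ARTIFACTS_DIR/"
    done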