See http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/54/changes

Changes:

[gkesavan] HADOOP-6182. Fix releaseaudit warnings by adding AL Headers.

[acmurthy] HADOOP-6192. Fix Shell.getUlimitMemoryCommand to not rely on Map-Reduce specific configs.

------------------------------------------
[...truncated 2931 lines...]
     [exec] checking dependency style of g++... gcc3
     [exec] checking how to run the C++ preprocessor... g++ -E
     [exec] checking for g77... no
     [exec] checking for f77... no
     [exec] checking for xlf... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for f90... no
     [exec] checking for xlf90... no
     [exec] checking for pgf90... no
     [exec] checking for epcf90... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for xlf95... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for gfortran... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether  accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag  works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for dlopen in -ldl... yes
     [exec] checking for JNI_GetCreatedJavaVMs in -ljvm... no
     [exec] checking for ANSI C header files... (cached) yes
     [exec] checking stdio.h usability... yes
     [exec] checking stdio.h presence... yes
     [exec] checking for stdio.h... yes
     [exec] checking stddef.h usability... yes
     [exec] checking stddef.h presence... yes
     [exec] checking for stddef.h... yes
     [exec] checking jni.h usability... yes
     [exec] checking jni.h presence... yes
     [exec] checking for jni.h... yes
     [exec] checking zlib.h usability... yes
     [exec] checking zlib.h presence... yes
     [exec] checking for zlib.h... yes
     [exec] checking Checking for the 'actual' dynamic-library for '-lz'... "libz.so.1"
     [exec] checking zconf.h usability... yes
     [exec] checking zconf.h presence... yes
     [exec] checking for zconf.h... yes
     [exec] checking Checking for the 'actual' dynamic-library for '-lz'... (cached) "libz.so.1"
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for memset... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating src/org/apache/hadoop/io/compress/zlib/Makefile
     [exec] config.status: creating lib/Makefile
     [exec] config.status: creating config.h
     [exec] config.status: config.h is unchanged
     [exec] config.status: executing depfiles commands
     [exec] make  all-recursive
     [exec] make[1]: Entering directory `http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32'
     [exec] Making all in src/org/apache/hadoop/io/compress/zlib
     [exec] make[2]: Entering directory `http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib'
     [exec] if /bin/bash ../../../../../../../libtool --mode=compile --tag=CC gcc -DHAVE_CONFIG_H -I. -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib  -I../../../../../../..  -I/home/hudson/tools/java/latest1.6/include -I/home/hudson/tools/java/latest1.6/include/linux -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src   -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF ".deps/ZlibCompressor.Tpo" -c -o ZlibCompressor.lo http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c; \
     [exec]     then mv -f ".deps/ZlibCompressor.Tpo" ".deps/ZlibCompressor.Plo"; else rm -f ".deps/ZlibCompressor.Tpo"; exit 1; fi
     [exec]  gcc -DHAVE_CONFIG_H -I. -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib  -I../../../../../../.. -I/home/hudson/tools/java/latest1.6/include -I/home/hudson/tools/java/latest1.6/include/linux -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src  -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c -fPIC -DPIC -o .libs/ZlibCompressor.o
     [exec]  gcc -DHAVE_CONFIG_H -I. -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib  -I../../../../../../.. -I/home/hudson/tools/java/latest1.6/include -I/home/hudson/tools/java/latest1.6/include/linux -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src  -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF .deps/ZlibCompressor.Tpo -c http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c -o ZlibCompressor.o >/dev/null 2>&1
     [exec] if /bin/bash ../../../../../../../libtool --mode=compile --tag=CC gcc -DHAVE_CONFIG_H -I. -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib  -I../../../../../../..  -I/home/hudson/tools/java/latest1.6/include -I/home/hudson/tools/java/latest1.6/include/linux -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src   -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibDecompressor.lo -MD -MP -MF ".deps/ZlibDecompressor.Tpo" -c -o ZlibDecompressor.lo http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.c; \
     [exec]     then mv -f ".deps/ZlibDecompressor.Tpo" ".deps/ZlibDecompressor.Plo"; else rm -f ".deps/ZlibDecompressor.Tpo"; exit 1; fi
     [exec]  gcc -DHAVE_CONFIG_H -I. -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib  -I../../../../../../.. -I/home/hudson/tools/java/latest1.6/include -I/home/hudson/tools/java/latest1.6/include/linux -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src  -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibDecompressor.lo -MD -MP -MF .deps/ZlibDecompressor.Tpo -c http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.c -fPIC -DPIC -o .libs/ZlibDecompressor.o
     [exec]  gcc -DHAVE_CONFIG_H -I. -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib  -I../../../../../../.. -I/home/hudson/tools/java/latest1.6/include -I/home/hudson/tools/java/latest1.6/include/linux -Ihttp://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src  -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibDecompressor.lo -MD -MP -MF .deps/ZlibDecompressor.Tpo -c http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.c -o ZlibDecompressor.o >/dev/null 2>&1
     [exec] /bin/bash ../../../../../../../libtool --mode=link --tag=CC gcc -g -Wall -fPIC -O2 -m32 -g -O2 -L/home/hudson/tools/java/latest1.6/jre/lib/i386/server -o libnativezlib.la ZlibCompressor.lo ZlibDecompressor.lo -ldl -ljvm -ldl
     [exec] rm -fr  .libs/libnativezlib.a .libs/libnativezlib.la
     [exec] ar cru .libs/libnativezlib.a .libs/ZlibCompressor.o .libs/ZlibDecompressor.o
     [exec] ranlib .libs/libnativezlib.a
     [exec] creating libnativezlib.la
     [exec] (cd .libs && rm -f libnativezlib.la && ln -s ../libnativezlib.la libnativezlib.la)
     [exec] make[2]: Leaving directory `http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/src/org/apache/hadoop/io/compress/zlib'
     [exec] Making all in lib
     [exec] make[2]: Entering directory `http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib'
     [exec] /bin/bash ../libtool --mode=link --tag=CC gcc -g -O2 -L/home/hudson/tools/java/latest1.6/jre/lib/i386/server -m32 -o libhadoop.la -rpath /usr/local/lib -version-info 1:0:0 ../src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.lo ../src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.lo -ldl -ljvm -ldl
     [exec] rm -fr .libs/libhadoop.a .libs/libhadoop.la .libs/libhadoop.lai .libs/libhadoop.so .libs/libhadoop.so.1 .libs/libhadoop.so.1.0.0
     [exec] gcc -shared ../src/org/apache/hadoop/io/compress/zlib/.libs/ZlibCompressor.o ../src/org/apache/hadoop/io/compress/zlib/.libs/ZlibDecompressor.o -L/home/hudson/tools/java/latest1.6/jre/lib/i386/server -ljvm -ldl -m32 -Wl,-soname -Wl,libhadoop.so.1 -o .libs/libhadoop.so.1.0.0
     [exec] (cd .libs && rm -f libhadoop.so.1 && ln -s libhadoop.so.1.0.0 libhadoop.so.1)
     [exec] (cd .libs && rm -f libhadoop.so && ln -s libhadoop.so.1.0.0 libhadoop.so)
     [exec] ar cru .libs/libhadoop.a ../src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.o ../src/org/apache/hadoop/io/compress/zlib/ZlibDecompressor.o
     [exec] ranlib .libs/libhadoop.a
     [exec] creating libhadoop.la
     [exec] (cd .libs && rm -f libhadoop.la && ln -s ../libhadoop.la libhadoop.la)
     [exec] make[2]: Leaving directory `http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib'
     [exec] make[2]: Entering directory `http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32'
     [exec] make[2]: Leaving directory `http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32'
     [exec] make[1]: Leaving directory `http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32'
     [exec] cp http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib/.libs/libhadoop.so.1.0.0 http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib/libhadoop.so.1.0.0
     [exec] (cd http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib && { ln -s -f libhadoop.so.1.0.0 libhadoop.so.1 || { rm -f libhadoop.so.1 && ln -s libhadoop.so.1.0.0 libhadoop.so.1; }; })
     [exec] (cd http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib && { ln -s -f libhadoop.so.1.0.0 libhadoop.so || { rm -f libhadoop.so && ln -s libhadoop.so.1.0.0 libhadoop.so; }; })
     [exec] cp http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib/.libs/libhadoop.lai http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib/libhadoop.la
     [exec] cp http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib/.libs/libhadoop.a http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib/libhadoop.a
     [exec] ranlib http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib/libhadoop.a
     [exec] chmod 644 http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/native/Linux-i386-32/lib/libhadoop.a
     [exec] libtool: install: warning: remember to run `libtool --finish /usr/local/lib'

compile-core:

jar:
      [tar] Nothing to do: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/classes/bin.tgz is up to date.
      [jar] Building jar: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/hadoop-core-2009-08-14_12-29-40.jar

findbugs:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/findbugs
 [findbugs] Executing findbugs from ant task
 [findbugs] Running FindBugs...
 [findbugs] Calculating exit code...
 [findbugs] Exit code set to: 0
 [findbugs] Output saved to http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/findbugs/hadoop-findbugs-report.xml
     [xslt] Processing http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/findbugs/hadoop-findbugs-report.xml to http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/findbugs/hadoop-findbugs-report.html
     [xslt] Loading stylesheet /home/nigel/tools/findbugs/latest/src/xsl/default.xsl

BUILD SUCCESSFUL
Total time: 2 minutes 27 seconds
+ RESULT=0
+ '[' 0 '!=' 0 ']'
+ mv build/hadoop-core-2009-08-14_12-29-40.tar.gz http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk
+ mv build/hadoop-core-2009-08-14_12-29-40.jar build/hadoop-core-test-2009-08-14_12-29-40.jar http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk
+ mv build/test/findbugs http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk
+ mv build/docs/api http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk
+ /home/hudson/tools/ant/latest/bin/ant clean
Buildfile: build.xml

clean-contrib:

clean:

clean:
     [echo] contrib: failmon
   [delete] Deleting directory http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/contrib/failmon

clean:
     [echo] contrib: hod
   [delete] Deleting directory http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/contrib/hod

clean:
   [delete] Deleting directory http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build
   [delete] Deleting directory http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/docs/build
   [delete] Deleting directory http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/docs/cn/build

BUILD SUCCESSFUL
Total time: 0 seconds
+ /home/hudson/tools/ant/latest/bin/ant -Dversion=2009-08-14_12-29-40 -Drun.clover=true -Dclover.home=/home/hudson/tools/clover/latest -Dpython.home=/home/nigel/tools/python/latest -Dtest.junit.output.format=xml -Dtest.output=yes -Dcompile.c++=yes -Dcompile.native=true clover checkstyle test generate-clover-reports
Buildfile: build.xml

clover.setup:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/clover/db
[clover-setup] Clover Version 2.4.3, built on March 09 2009 (build-756)
[clover-setup] Loaded from: /home/hudson/tools/clover/latest/lib/clover.jar
[clover-setup] Clover: Open Source License registered to Apache.
[clover-setup] Clover is enabled with initstring 'http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/test/clover/db/hadoop_coverage.db'

clover.info:

clover:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
      [get] To: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/ivy/ivy-2.0.0-rc2.jar
      [get] Not modified - so not downloaded

ivy-init-dirs:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/ivy
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/ivy/lib
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/ivy/report
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build/ivy/maven

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/ivy/ivysettings.xml

ivy-resolve-checkstyle:
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#Hadoop-Core;2009-08-14_12-29-40
[ivy:resolve]   confs: [checkstyle]
[ivy:resolve]   found checkstyle#checkstyle;4.2 in maven2
[ivy:resolve]   found antlr#antlr;2.7.6 in maven2
[ivy:resolve]   found commons-beanutils#commons-beanutils-core;1.7.0 in maven2
[ivy:resolve]   found commons-cli#commons-cli;1.0 in maven2
[ivy:resolve]   found commons-lang#commons-lang;1.0 in maven2
[ivy:resolve]   found junit#junit;3.7 in maven2
[ivy:resolve]   found commons-collections#commons-collections;2.1 in maven2
[ivy:resolve]   found commons-logging#commons-logging;1.0.3 in maven2
[ivy:resolve] :: resolution report :: resolve 244ms :: artifacts dl 9ms
[ivy:resolve]   :: evicted modules:
[ivy:resolve]   commons-logging#commons-logging;1.0 by [commons-logging#commons-logging;1.0.3] in [checkstyle]
[ivy:resolve]   commons-collections#commons-collections;2.0 by [commons-collections#commons-collections;2.1] in [checkstyle]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |    checkstyle    |   10  |   0   |   0   |   2   ||   8   |   0   |
        ---------------------------------------------------------------------

ivy-retrieve-checkstyle:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#Hadoop-Core
[ivy:retrieve]  confs: [checkstyle]
[ivy:retrieve]  8 artifacts copied, 0 already retrieved (1526kB/14ms)
No ivy:settings found for the default reference 'ivy.instance'.  A default instance will be used
DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/ivy/ivysettings.xml

check-for-checkstyle:

checkstyle:

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/build.xml:592: Unable to create a Checker: unable to read http://hudson.zones.apache.org/hudson/job/Hadoop-Common-trunk/ws/trunk/src/test/checkstyle.xml - unable to parse configuration stream - The processing instruction target matching "[xX][mM][lL]" is not allowed.:17:6

Total time: 1 second
Publishing Javadoc
Recording test results
Recording fingerprints
Publishing Clover coverage report...
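
The checkstyle failure above is an XML parse error, not a rule violation: Xerces reports "The processing instruction target matching "[xX][mM][lL]" is not allowed" when the <?xml ...?> declaration is not the very first thing in the file, and it stops in src/test/checkstyle.xml at line 17, column 6. That is consistent with a header having been inserted above the XML declaration (plausibly the AL-header change listed at the top of this build, though the log alone does not confirm it). A minimal sketch, using hypothetical file contents rather than the actual checkstyle.xml, that reproduces the same exception with the JDK's default parser:

    // Hypothetical reproduction of the parse error seen in the checkstyle step.
    // The XML string below is an assumption, not the real contents of checkstyle.xml.
    import java.io.ByteArrayInputStream;
    import javax.xml.parsers.DocumentBuilderFactory;

    public class XmlDeclNotFirst {
        public static void main(String[] args) throws Exception {
            String broken =
                "<!-- license header inserted above the declaration -->\n"
                + "<?xml version=\"1.0\"?>\n"
                + "<module name=\"Checker\"/>\n";
            // Any content before <?xml ...?> makes the parser treat "<?xml" as an
            // ordinary processing instruction, and PI targets matching [xX][mM][lL]
            // are reserved, so parsing fails.
            DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(broken.getBytes("UTF-8")));
            // Throws org.xml.sax.SAXParseException:
            //   The processing instruction target matching "[xX][mM][lL]" is not allowed.
        }
    }

Moving any such header below the <?xml ...?> line (comments are legal anywhere after the declaration) would let the parser read the file again.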
