What Clover version does this build use? Apache has an unlimited Clover
license available in the protected committers' svn area:
https://svn.apache.org/repos/private/committers/donated-licenses/clover/2.6.x
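If the job is picking up an expired license bundled next to clover.jar, the
fix is to point the build at the donated license file instead. A minimal
sketch of how the Jenkins shell step could do this (assuming Clover honours
the clover.license.path system property; the license path below is only a
placeholder for wherever the file is checked out on the build slave):

    # Placeholder path: check the license out of the committers' svn area first.
    CLOVER_LICENSE=/home/jenkins/tools/clover/clover.license

    # Same targets the Pig job already runs, plus the license property.
    ant -Drun.clover=true \
        -Dclover.home=/home/jenkins/tools/clover/latest \
        -Dclover.license.path=$CLOVER_LICENSE \
        clover test-commit generate-clover-reports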

Uwe

-----
Uwe Schindler
H.-H.-Meier-Allee 63, D-28213 Bremen
http://www.thetaphi.de
eMail: u...@thetaphi.de


> -----Original Message-----
> From: Daniel Dai [mailto:da...@hortonworks.com]
> Sent: Thursday, January 12, 2012 11:33 PM
> To: builds@apache.org
> Subject: Can anyone give some hint of how to solve clover license issue?
> 
> I sent it a couple of days back and got no response. Can someone help?
> 
> Thanks,
> Daniel
> 
> ---------- Forwarded message ----------
> From: Apache Jenkins Server <jenk...@builds.apache.org>
> Date: Thu, Jan 12, 2012 at 2:17 PM
> Subject: Build failed in Jenkins: Pig-trunk #1169
> To: d...@pig.apache.org
> 
> 
> See <https://builds.apache.org/job/Pig-trunk/1169/changes>
> 
> Changes:
> 
> [daijy] PIG-2431: Upgrade bundled hadoop version to 1.0.0
> 
> [dvryaboy] PIG-2468: Speed up TestBuiltin
> 
> ------------------------------------------
> [...truncated 6943 lines...]
>  [findbugs]   org.mozilla.javascript.NativeJavaObject
>  [findbugs]   jline.ConsoleReaderInputStream
>  [findbugs]   org.apache.log4j.PropertyConfigurator
>  [findbugs]   org.apache.hadoop.mapred.TaskID
>  [findbugs]   org.apache.commons.cli.CommandLine
>  [findbugs]   org.python.core.Py
>  [findbugs]   org.apache.hadoop.io.BooleanWritable$Comparator
>  [findbugs]   org.apache.hadoop.io.LongWritable
>  [findbugs]   org.antlr.runtime.BitSet
>  [findbugs]   org.apache.hadoop.mapred.jobcontrol.Job
>  [findbugs]   org.apache.hadoop.hbase.filter.CompareFilter$CompareOp
>  [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Reader
>  [findbugs]   org.mozilla.javascript.NativeFunction
>  [findbugs]   org.apache.hadoop.mapreduce.Counter
>  [findbugs]   org.codehaus.jackson.JsonEncoding
>  [findbugs]   org.codehaus.jackson.JsonParseException
>  [findbugs]   org.python.core.PyCode
>  [findbugs]   com.jcraft.jsch.HostKey
>  [findbugs]   org.apache.hadoop.hbase.filter.Filter
>  [findbugs]   org.apache.commons.logging.Log
>  [findbugs]   com.google.common.util.concurrent.ListenableFuture
>  [findbugs]   org.apache.hadoop.util.RunJar
>  [findbugs]   org.apache.hadoop.mapred.Counters$Group
>  [findbugs]   com.jcraft.jsch.ChannelExec
>  [findbugs]   org.apache.hadoop.hbase.util.Base64
>  [findbugs]   org.antlr.runtime.TokenStream
>  [findbugs]   com.google.common.util.concurrent.CheckedFuture
>  [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Reader$Scanner$Entry
>  [findbugs]   org.apache.hadoop.fs.FSDataInputStream
>  [findbugs]   org.python.core.PyObject
>  [findbugs]   jline.History
>  [findbugs]   org.apache.hadoop.io.BooleanWritable
>  [findbugs]   org.apache.log4j.Logger
>  [findbugs]   org.apache.hadoop.hbase.filter.FamilyFilter
>  [findbugs]   org.antlr.runtime.IntStream
>  [findbugs]   org.apache.hadoop.util.ReflectionUtils
>  [findbugs]   org.apache.hadoop.fs.ContentSummary
>  [findbugs]   org.python.core.PyTuple
>  [findbugs]   org.apache.hadoop.conf.Configuration
>  [findbugs]   org.apache.hadoop.mapreduce.lib.input.FileSplit
>  [findbugs]   org.apache.hadoop.mapred.Counters$Counter
>  [findbugs]   com.jcraft.jsch.Channel
>  [findbugs]   org.apache.hadoop.mapred.JobPriority
>  [findbugs]   org.apache.commons.cli.Options
>  [findbugs]   org.apache.hadoop.mapred.JobID
>  [findbugs]   org.apache.hadoop.util.bloom.BloomFilter
>  [findbugs]   org.python.core.PyFrame
>  [findbugs]   org.apache.hadoop.hbase.filter.CompareFilter
>  [findbugs]   org.apache.hadoop.util.VersionInfo
>  [findbugs]   org.python.core.PyString
>  [findbugs]   org.apache.hadoop.io.Text$Comparator
>  [findbugs]   org.antlr.runtime.MismatchedSetException
>  [findbugs]   org.apache.hadoop.io.BytesWritable
>  [findbugs]   org.apache.hadoop.fs.FsShell
>  [findbugs]   org.mozilla.javascript.ImporterTopLevel
>  [findbugs]   org.apache.hadoop.hbase.mapreduce.TableOutputFormat
>  [findbugs]   org.apache.hadoop.mapred.TaskReport
>  [findbugs]   org.antlr.runtime.tree.RewriteRuleSubtreeStream
>  [findbugs]   org.apache.commons.cli.HelpFormatter
>  [findbugs]   org.mozilla.javascript.NativeObject
>  [findbugs]   org.apache.hadoop.hbase.HConstants
>  [findbugs]   org.apache.hadoop.io.serializer.Deserializer
>  [findbugs]   org.antlr.runtime.FailedPredicateException
>  [findbugs]   org.apache.hadoop.io.compress.CompressionCodec
>  [findbugs]   org.apache.hadoop.fs.FileStatus
>  [findbugs]   org.apache.hadoop.hbase.client.Result
>  [findbugs]   org.apache.hadoop.mapreduce.JobContext
>  [findbugs]   org.codehaus.jackson.JsonGenerator
>  [findbugs]   org.apache.hadoop.mapreduce.TaskAttemptContext
>  [findbugs]   org.apache.hadoop.io.BytesWritable$Comparator
>  [findbugs]   org.apache.hadoop.io.LongWritable$Comparator
>  [findbugs]   org.codehaus.jackson.map.util.LRUMap
>  [findbugs]   org.apache.hadoop.hbase.util.Bytes
>  [findbugs]   org.antlr.runtime.MismatchedTokenException
>  [findbugs]   org.codehaus.jackson.JsonParser
>  [findbugs]   com.jcraft.jsch.UserInfo
>  [findbugs]   org.python.core.PyException
>  [findbugs]   org.apache.commons.cli.ParseException
>  [findbugs]   org.apache.hadoop.io.compress.CompressionOutputStream
>  [findbugs]   org.apache.hadoop.hbase.filter.WritableByteArrayComparable
>  [findbugs]   org.antlr.runtime.tree.CommonTreeNodeStream
>  [findbugs]   org.apache.log4j.Level
>  [findbugs]   org.apache.hadoop.hbase.client.Scan
>  [findbugs]   org.apache.hadoop.mapreduce.Job
>  [findbugs]   com.google.common.util.concurrent.Futures
>  [findbugs]   org.apache.commons.logging.LogFactory
>  [findbugs]   org.apache.commons.codec.binary.Base64
>  [findbugs]   org.codehaus.jackson.map.ObjectMapper
>  [findbugs]   org.apache.hadoop.fs.FileSystem
>  [findbugs]   org.apache.hadoop.hbase.filter.FilterList$Operator
>  [findbugs]   org.apache.hadoop.hbase.io.ImmutableBytesWritable
>  [findbugs]   org.apache.hadoop.io.serializer.SerializationFactory
>  [findbugs]   org.antlr.runtime.tree.TreeAdaptor
>  [findbugs]   org.apache.hadoop.mapred.RunningJob
>  [findbugs]   org.antlr.runtime.CommonTokenStream
>  [findbugs]   org.apache.hadoop.io.DataInputBuffer
>  [findbugs]   org.apache.hadoop.io.file.tfile.TFile
>  [findbugs]   org.apache.commons.cli.GnuParser
>  [findbugs]   org.mozilla.javascript.Context
>  [findbugs]   org.apache.hadoop.io.FloatWritable
>  [findbugs]   org.antlr.runtime.tree.RewriteEarlyExitException
>  [findbugs]   org.apache.hadoop.hbase.HBaseConfiguration
>  [findbugs]   org.codehaus.jackson.JsonGenerationException
>  [findbugs]   org.apache.hadoop.mapreduce.TaskInputOutputContext
>  [findbugs]   org.apache.hadoop.io.compress.GzipCodec
>  [findbugs]   org.apache.hadoop.mapred.jobcontrol.JobControl
>  [findbugs]   org.antlr.runtime.BaseRecognizer
>  [findbugs]   org.apache.hadoop.fs.FileUtil
>  [findbugs]   org.apache.hadoop.fs.Path
>  [findbugs]   org.apache.hadoop.hbase.client.Put
>  [findbugs]   org.apache.hadoop.io.file.tfile.TFile$Writer
>  [findbugs]   jline.ConsoleReader
>  [findbugs]   com.google.common.collect.Lists
>  [findbugs]   org.apache.hadoop.mapreduce.MapContext
>  [findbugs]   org.python.core.PyJavaPackage
>  [findbugs]   org.apache.hadoop.hbase.filter.ColumnPrefixFilter
>  [findbugs]   org.python.core.PyStringMap
>  [findbugs]   org.apache.hadoop.mapreduce.TaskID
>  [findbugs]   org.apache.hadoop.hbase.client.HTable
>  [findbugs]   org.apache.hadoop.io.FloatWritable$Comparator
>  [findbugs]   org.apache.zookeeper.ZooKeeper
>  [findbugs]   org.codehaus.jackson.map.JsonMappingException
>  [findbugs]   org.python.core.PyFunction
>  [findbugs]   org.antlr.runtime.TokenSource
>  [findbugs]   com.jcraft.jsch.ChannelDirectTCPIP
>  [findbugs]   com.jcraft.jsch.JSchException
>  [findbugs]   org.python.util.PythonInterpreter
>  [findbugs]   org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil
>  [findbugs]   org.python.core.PyInteger
>  [findbugs]   org.apache.hadoop.mapred.JobConf
>  [findbugs]   org.apache.hadoop.util.bloom.Key
>  [findbugs]   org.apache.hadoop.io.Text
>  [findbugs]   org.antlr.runtime.NoViableAltException
>  [findbugs]   org.apache.hadoop.util.GenericOptionsParser
>  [findbugs]   org.apache.hadoop.mapreduce.JobID
>  [findbugs]   org.apache.hadoop.mapreduce.TaskAttemptID
>  [findbugs]   org.apache.hadoop.filecache.DistributedCache
>  [findbugs]   org.apache.hadoop.fs.FSDataOutputStream
>  [findbugs]   org.python.core.PyList
>  [findbugs]   org.antlr.runtime.tree.TreeNodeStream
>  [findbugs]   org.apache.hadoop.hbase.filter.BinaryComparator
>  [findbugs]   dk.brics.automaton.RegExp
>  [findbugs]   org.mozilla.javascript.Scriptable
>  [findbugs]   org.mozilla.javascript.EcmaError
>  [findbugs]   org.apache.hadoop.io.serializer.Serializer
>  [findbugs]   org.apache.hadoop.util.bloom.Filter
>  [findbugs]   org.python.core.PyNone
>  [findbugs]   org.mozilla.javascript.Function
>  [findbugs]   org.python.core.PySystemState
>  [findbugs]   org.antlr.runtime.RecognizerSharedState
>  [findbugs]   org.codehaus.jackson.JsonFactory
>  [findbugs]   org.antlr.runtime.EarlyExitException
>  [findbugs]   org.apache.hadoop.hdfs.DistributedFileSystem
>  [findbugs]   org.apache.hadoop.util.LineReader
>  [findbugs] Warnings generated: 18
>  [findbugs] Missing classes: 230
>  [findbugs] Calculating exit code...
>  [findbugs] Setting 'missing class' flag (2)
>  [findbugs] Setting 'bugs found' flag (1)
>  [findbugs] Exit code set to: 3
>  [findbugs] Java Result: 3
>  [findbugs] Classes needed for analysis were missing
>  [findbugs] Output saved to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.xml>
>     [xslt] Processing <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.xml> to <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/findbugs/pig-findbugs-report.html>
>     [xslt] Loading stylesheet /home/jenkins/tools/findbugs/latest/src/xsl/default.xsl
> 
> BUILD SUCCESSFUL
> Total time: 7 minutes 55 seconds
> 
> 
> ======================================================================
> ======================================================================
> STORE: saving artifacts
> ======================================================================
> ======================================================================
> 
> 
> 
> 
> ======================================================================
> ======================================================================
> CLEAN: cleaning workspace
> ======================================================================
> ======================================================================
> 
> 
> Buildfile: build.xml
> 
> clean:
>   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/src-gen>
>   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/src/docs/build>
>   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/build>
>   [delete] Deleting directory <https://builds.apache.org/job/Pig-trunk/ws/trunk/test/org/apache/pig/test/utils/dotGraph/parser>
>   [delete] Deleting: <https://builds.apache.org/job/Pig-trunk/ws/trunk/pig.jar>
>   [delete] Deleting: <https://builds.apache.org/job/Pig-trunk/ws/trunk/pig-withouthadoop.jar>
> 
> clean:
> 
> clean:
> 
> BUILD SUCCESSFUL
> Total time: 0 seconds
> 
> 
> ======================================================================
> ======================================================================
> ANALYSIS: ant -Drun.clover=true -Dclover.home=/homes/hudson/tools/clover/latest clover test-commit generate-clover-reports -Dtest.junit.output.format=xml -Dtest.output=yes -Dversion=${BUILD_ID} -Dfindbugs.home=$FINDBUGS_HOME -Djava5.home=$JAVA5_HOME -Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME -Declipse.home=$ECLIPSE_HOME
> ======================================================================
> ======================================================================
> 
> 
> Buildfile: build.xml
> 
> clover.setup:
>    [mkdir] Created dir: <https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/clover/db>
> [clover-setup] Clover Version 3.1.0, built on May 31 2011 (build-821)
> [clover-setup] Loaded from: /home/jenkins/tools/clover/latest/lib/clover.jar
> 
> BUILD FAILED
> java.lang.RuntimeException: Clover upgrades for your license ended December 14 2010, and this version of Clover was built May 31 2011. Please visit http://www.atlassian.com/clover/renew for information on upgrading your license.
>        at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:103)
>        at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:25)
>        at com.cenqua.clover.tasks.AbstractCloverTask.execute(AbstractCloverTask.java:52)
>        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
>        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
>        at org.apache.tools.ant.Task.perform(Task.java:348)
>        at org.apache.tools.ant.Target.execute(Target.java:357)
>        at org.apache.tools.ant.Target.performTasks(Target.java:385)
>        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
>        at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
>        at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
>        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
>        at org.apache.tools.ant.Main.runBuild(Main.java:758)
>        at org.apache.tools.ant.Main.startAnt(Main.java:217)
>        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
>        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)
> 
> Total time: 1 second
> Build step 'Execute shell' marked build as failure
> [FINDBUGS] Skipping publisher since build result is FAILURE
> Recording test results
> Publishing Javadoc
> Archiving artifacts
> Recording fingerprints
> Publishing Clover coverage report...
> No Clover report will be published due to a Build Failure
