Yeah, FWIW, any build directory that doesn't have a build.xml (and therefore
can't be loaded by Jenkins, displayed in the UI, etc.) and any job directory
that doesn't have a config.xml should just be rm'd. Those directories eat
disk space with no way of ever being used.
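For anyone scripting this, a rough sketch of how you might *list* (not
delete) those candidates. It assumes the conventional
$JENKINS_HOME/jobs/<job>/builds/<number> layout and GNU/BSD find with
-mindepth/-maxdepth; adjust for the actual master, and eyeball the output
before removing anything:

```shell
#!/bin/sh
# Sketch only: list job dirs missing config.xml and build dirs missing
# build.xml under a conventional Jenkins home layout. Nothing is deleted.

# Job directories (one level under jobs/) with no config.xml
list_dead_jobs() {
  find "$1/jobs" -mindepth 1 -maxdepth 1 -type d \
    ! -exec sh -c '[ -e "$1/config.xml" ]' sh {} ';' -print
}

# Build directories (jobs/<job>/builds/<n>) with no build.xml
list_dead_builds() {
  find "$1/jobs" -mindepth 3 -maxdepth 3 -type d -path '*/builds/*' \
    ! -exec sh -c '[ -e "$1/build.xml" ]' sh {} ';' -print
}

# Example (path is illustrative):
# list_dead_jobs /var/lib/jenkins
# list_dead_builds /var/lib/jenkins
```

Review the output carefully before feeding any of it to rm -rf.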

A.

On Wed, Apr 25, 2018 at 11:07 AM, Chris Lambertus <c...@apache.org> wrote:

>
>
> > On Apr 25, 2018, at 7:49 AM, Allen Wittenauer <a...@effectivemachines.com>
> wrote:
> >
> >> Using the yetus jobs as a reference, yetus-java builds 480 and 481 are
> nearly a year old, but only contain a few kilobytes of data. While removing
> them saves no space, they also provide no value,
> >
> >       … to infra.
> >
> >       The value to the communities that any job services is really up to
> those communities to decide.
>
>
> I mean builds 480 and 481 literally provide no value, since they are not
> accessible via the Jenkins UI and contain only a polling.log file from
> 2017. I recognize that some projects may want to retain specific older
> builds, but I personally question the utility of data older than 6
> months. For build data that’s significantly outside the nominal 30 day / 10
> job window, we’d ask that it be exported and managed locally by the
> project rather than remaining “live” on the master. Is there a reason other
> than convenience for it to remain live?
>
> Just based on some initial review, excluding the build logs and job
> metadata is probably doable for an initial pass at purging old data, but
> I’ll want to generate some data on how many old build histories exist. As I
> stated earlier, the main goal here is to remove dead jobs and binary
> artifacts. There do appear to be a fair few jobs that no longer exist, as
> mentioned elsewhere in the thread, so hopefully we’ll realize some notable
> performance and space improvements by culling the low-hanging fruit before
> a more drastic approach is required.
>
> -Chris
>
>