On 24 November 2014 at 18:56, Manuel A. Fernandez Montecelo
<manuel.montez...@gmail.com> wrote:
> 2014-11-23 14:27 Stuart Prescott:
>>
>> Svante Signell wrote:
>>
>>> I wonder how old a package build can be and still be part of the release.
>>> Some packages were built up to a year ago, and rebuilding them now FTBFS.
>>
>>
>> As others have noted already, there are periodic archive rebuilds to check
>> what would now FTBFS.
>>
>> Slightly orthogonal to your question, "up to a year ago" doesn't come close,
>> actually... 42% of source packages in jessie were uploaded over a year ago,
>> 25% over two years ago, 15% over three years ago, 9% over four years ago.
>> Fun fact: there are 64 source packages in jessie that are over 10 years old.
>>
>> Age of source packages in each release:
>>
>>         http://ircbots.debian.net/stats/package_ages.png
>>
>> (note: this is based on source package uploads; binNMUs not included)
>
>
> Not backed by hard data as nicely as you have here, but from my random
> observations of the state of many packages over many years (at BSPs, during
> architecture bootstrapping, etc.), I reached similar conclusions.
>
>
> Would it make sense to trigger rebuilds (or binNMUs, or actions to the same
> effect) for all packages in a given architecture when there are important
> changes in the toolchain, e.g. when the default GCC bumps to a new major
> version [1]?
>
> To me it makes a lot of sense to have most of the archive rebuilt at that
> time, and to have these newly compiled packages migrate at some point to
> testing so that they are present in the next release.
>
> With the current method it seems that we only check whether a package FTBFS,
> and packages only get recompiled when there are new uploads or binNMUs.  But
> actually recompiling them and moving the result into the archive would reveal
> changes in runtime behaviour, so bugs might be caught/reported/hopefully-fixed
> sooner.  On some occasions there might even be bugs fixed (if the compiler of
> years ago was buggy) or performance improvements.
>
> I do not know whether there are significant drawbacks, for example whether it
> would be very taxing to have everything built in a short period of time
> (especially for less powerful architectures) -- not only for the hardware,
> but also in delays to other packages that are uploaded at the same time.  Or
> whether there are other fundamental problems that I have not considered.
>
> In summary, if there is a cheap (especially in terms of human effort) way to
> achieve this, I think it would be worth doing.
>

OBS (the Open Build Service) ships a tool called "build-compare" which
compares the results of two builds to see whether they are really different;
among other things it analyses the disassembly.  E.g. while I was fiddling
with a toolchain, enabling AVX by default showed me all the binaries whose
instructions changed because of it, and those builds were kept/published as
modified instead of being discarded as "identical" to the previous ones.
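
I haven't looked at build-compare's internals in detail recently, so the
following is only a rough sketch of the kind of disassembly check it performs,
not a description of the real tool: disassemble both binaries with objdump and
compare the output after normalising away addresses.  The normalisation regex
and the exit-code convention are my own assumptions:

#!/usr/bin/env python3
# Sketch only: "did the generated instructions change between two builds?"
import re
import subprocess
import sys

# Collapse anything that looks like an address, so that pure layout shifts
# (different load addresses, relocated call targets) do not count as changes.
_ADDR = re.compile(r"\b[0-9a-f]{4,}\b")


def disassemble(path):
    """Return normalised objdump -d output for one binary."""
    out = subprocess.run(
        ["objdump", "-d", "--no-show-raw-insn", path],
        capture_output=True, text=True, check=True,
    ).stdout
    lines = []
    for line in out.splitlines():
        if "file format" in line:
            continue  # this header line embeds the file path itself
        lines.append(_ADDR.sub("<addr>", line).strip())
    return lines


if __name__ == "__main__":
    old_bin, new_bin = sys.argv[1], sys.argv[2]
    if disassemble(old_bin) != disassemble(new_bin):
        print("generated code changed -- keep/publish the rebuild")
        sys.exit(1)
    print("effectively identical -- the rebuild could be discarded")

Something along those lines is enough to spot the AVX case above: the
mnemonics and operands change, so the normalised disassemblies no longer
match.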

IMHO it would make sense to locally rebuild the binaries that have been
unchanged since stable, compare the generated maintainer scripts and/or run
the binaries through build-compare to see whether there are worthwhile
differences.  If there are, ask the release team to binNMU those (or do
no-change uploads, in the case of changes to arch:all packages).
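
For the maintainer-script comparison, a rough sketch (again my own idea of how
it could be done, not an existing tool) could extract the control area of the
old and the rebuilt .deb with dpkg-deb -e and diff the scripts; the list of
script names and the exit-code convention below are assumptions:

#!/usr/bin/env python3
# Sketch only: diff the maintainer scripts of two builds of the same package.
import difflib
import subprocess
import sys
import tempfile
from pathlib import Path

# Assumed set of maintainer scripts worth checking.
SCRIPTS = ("preinst", "postinst", "prerm", "postrm")


def extract_control(deb, dest):
    """Unpack the control area (maintainer scripts etc.) of a .deb."""
    subprocess.run(["dpkg-deb", "-e", str(deb), str(dest)], check=True)


def script_diffs(old_deb, new_deb):
    """Yield a unified diff for every maintainer script that changed."""
    with tempfile.TemporaryDirectory() as tmp:
        old_dir, new_dir = Path(tmp, "old"), Path(tmp, "new")
        extract_control(old_deb, old_dir)
        extract_control(new_deb, new_dir)
        for name in SCRIPTS:
            old_path, new_path = old_dir / name, new_dir / name
            old_text = old_path.read_text() if old_path.exists() else ""
            new_text = new_path.read_text() if new_path.exists() else ""
            if old_text != new_text:
                yield "".join(difflib.unified_diff(
                    old_text.splitlines(keepends=True),
                    new_text.splitlines(keepends=True),
                    fromfile="old/" + name, tofile="new/" + name))


if __name__ == "__main__":
    changed = list(script_diffs(sys.argv[1], sys.argv[2]))
    for diff in changed:
        print(diff)
    # Non-zero exit = "differences worth a binNMU / no-change upload".
    sys.exit(1 if changed else 0)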

-- 
Regards,

Dimitri.

