On 2/21/10, Dave Korn <dave.korn.cyg...@googlemail.com> wrote:
> On 21/02/2010 20:03, Martin Guy wrote:
>  > The point about defaults is that the GCC default tends to filter down
>  > into the default for distributions;
>   I'd find it surprising if that was really the way it happens; don't
>  distributions make deliberate and conscious decisions about binary standards
>  and things like that?

Changing the default without losing that compatibility would require
that every distro (and there are hundreds of them) either already
specifies a specific arch or that its GCC maintainer notices the
change in GCC and adds explicit configuration options to revert it.
The big ones with dedicated GCC maintainers probably already do that;
the others just configure and make the standard distribution and take
what comes.
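
For example (sketching from memory; the exact option names are in
GCC's installation docs and may vary by target and release), a distro
that wants to keep the old default can pin it when building GCC
itself:

    # Pin the code-generation defaults at GCC configure time
    # (i386/x86 targets; adjust the values to your chosen baseline)
    ./configure --with-arch=i486 --with-tune=generic
    make && make install

After that, a plain "gcc" behaves as if -march=i486 were always
passed, whatever the compiler's built-in default becomes.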

On 2/21/10, H.J. Lu <hjl.to...@gmail.com> wrote:
> There is nothing which stops them from using -march=i386. It just may not
>  be the default.
There is: the arch that the libraries in their distro were compiled to run on.
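
To illustrate (my own example, not anything a distro actually
shipped): a library built for a newer arch can contain instructions an
older CPU simply does not have, so an application compiled with
-march=i386 still dies the instant it calls into that library. A
conditional move is the classic case:

    /* max.c -- imagine this inside a distro-built shared library */
    int imax(int a, int b) { return a > b ? a : b; }

    $ gcc -O2 -march=i686 -c max.c  # typically emits cmov (P6 and up)
    $ gcc -O2 -march=i386 -c max.c  # emits a compare and a branch

Run the i686 build on a real 386 or 486 and the cmov traps as an
illegal instruction, no matter what the application itself was
compiled for.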

On 2/21/10, Steven Bosscher <stevenb....@gmail.com> wrote:
> On Sun, Feb 21, 2010 at 9:22 PM, Erik Trulsson <ertr1...@student.uu.se> wrote:
>  > One of the great advantages of much free/open software is the way it
>  > will work just fine even on older hardware.
>  And, let's face it, most users of gcc don't use it because it is free
>  software but because it performs just fine for them. And when it does
>  not, they just as easily switch to another compiler.
Hardly. At present there is a GCC monoculture, both in which compiler
ships as standard with most systems and in which compiler packages
will build with, either because the build system uses GCC-specific
flags or because the code uses GCC extensions.
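
A typical fragment of what I mean (hypothetical, but the idioms are
everywhere): even small projects casually pick up GCC-isms that other
compilers of the day reject outright:

    /* Statement expressions and typeof are GCC extensions */
    #define MAX(a, b) ({ typeof(a) _a = (a); \
                         typeof(b) _b = (b); \
                         _a > _b ? _a : _b; })

    /* So are the __builtin_* functions */
    void *caller = __builtin_return_address(0);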

On 2/21/10, Steven Bosscher <stevenb....@gmail.com> wrote:
> Which brings us back to the discussion of satisfying the needs of a
>  tiny minority while hurting the vast majority of users.
There's a difference in quality between the two. The "hurt" is that
powerful modern PCs might take 20% longer to encode a DVD, while the
"needs" are for the bulk of software to run at all on their poor
hardware.

It's usual in modern societies to give priority to enabling the
underprivileged to function at all over giving the well-off the
maximum of comfort and speed, but how you value the two aspects
probably depends on your personal experience of the two realities.

On 2/21/10, Dave Korn <dave.korn.cyg...@googlemail.com> wrote:
> On 21/02/2010 21:53, Steven Bosscher wrote:
>  > Yes, of course -- but what is the advantage of using the latest GCC
>  > for such an older processor?
>   Tree-SSA?  LTO?  Fixed bugs?  New languages?  Etc?  I can see plenty of good
>  reasons for it.
Apart from those factors (and one hopes that, in general, all code
generation improves from release to release), users may not really
have a choice: they are most likely to try (or be given) the most
recent stable version of whatever distro, and distros tend to ship
the most recent stable GCC in each new release.

Let me add another example from my own experience: in 2001 I was stuck
for months in a crumbling house in the countryside with nothing but an
8MB 25MHz 386, because that's all I had available at the time (green
screen, yay!), and I completed what would have been my postgraduate
degree project, begun in 1985: an unlimited-precision floating-point
math library in a pure functional language. That I could do that at
all may be due to GCC's "work on the minimum" policy of the time, both
in the distro and on whatever machine David Turner used to compile the
binary-only release of the Miranda interpreter.

If I recall correctly, the current default is -march=i486 -mtune=i486,
and the instructions the 386 lacks are trapped and emulated in the
kernel.
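
You can see the 486 assumption directly in the generated code (my own
illustration; bswap arrived with the 486):

    /* swap.c */
    unsigned swap(unsigned x) { return __builtin_bswap32(x); }

    $ gcc -O2 -march=i486 -S swap.c  # uses the bswap instruction
    $ gcc -O2 -march=i386 -S swap.c  # falls back to rotates and shifts

On a 386 the first version would trap on every call unless something
below it picks up the pieces.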

On 2/21/10, Steven Bosscher <stevenb....@gmail.com> wrote:
> Well, as Martin already pointed out (contradicting his own point):
>  Apparently a lot of distributions *do* change the defaults.

That's OK, I don't have The Truth in my pocket. Nor do I have any
quantifiable measure of the number of different systems in use in the
whole world; I'm just sharing a value judgement based on a different
set of experiences of the outcomes of restrictive and generous
policies on minimum CPU targeting.

My direct experience is that low-end PCs are widely used in societies
where things are hard, and that upstream software developers are
always given the latest, fastest computers to make them more
productive and are unaware of the struggling masses :)

Cheers

  M
