On Sat, 16 Jul 2011 22:56:53 +0200
dexen deVries <dexen.devr...@gmail.com> wrote:

> On Saturday 16 July 2011 21:54:33 erik quanstrom wrote:
> > it's interesting you bring this up.  risc has largely been removed
> > from architectures.  if you tie the instruction set and machine model
> > to the actual hardware, then you need to write new compilers and
> > recompile everything every few years.  instead, the risc is hidden
> > and the instruction set stays the same.  this allows for a lot of
> > under-the-hood innovation in isolation from the architecture.
> 
> 
> interesting angle. till now i believed it's easier to innovate in software 
> (even compilers) than in silicon. where did we go wrong that silicon became 
> the easier way? would it be fair to blame GCC and other heavyweight champions?

Gcc has mutual incompatibilities between its own versions, caused by its 
attempts to correctly interpret today's heavyweight C standards, but I 
wouldn't say gcc is the big problem. Some of the most essential libraries in 
a Linux system are real bugbears to compile, particularly for a new arch.

I'd say it's just part of the ossification of software today. It's become 
extremely rigid and brittle, perhaps even more so in open source than in 
commercial contexts.
