> On Oct 5, 2017, at 6:19 PM, allison via cctalk <cctalk@classiccmp.org> wrote:
> 
> Moore's law only worked for hardware; software typically lagged two
> years behind.

There's a more cynical view, sometimes called "the virtual disease": software 
performance stays constant because all the gains from Moore's law are consumed 
by increased software inefficiency and complexity.

A related observation comes from my boss at DEC: interrupts (context 
switches) always take 10 microseconds, no matter how fast the CPU is.
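
If you want to test that claim on a modern box, here's a rough sketch (my 
own, nothing from DEC): two processes ping-pong a byte over a pair of pipes, 
so each round trip costs roughly two context switches. It's in the spirit of 
lmbench's lat_ctx, minus all the care; the iteration count and the 
two-switches-per-round-trip accounting are back-of-the-envelope assumptions.

#include <stdio.h>
#include <unistd.h>
#include <time.h>

int main(void)
{
    int p2c[2], c2p[2];          /* parent->child and child->parent pipes */
    const long iters = 100000;
    char b = 'x';

    if (pipe(p2c) < 0 || pipe(c2p) < 0) { perror("pipe"); return 1; }

    pid_t pid = fork();
    if (pid < 0) { perror("fork"); return 1; }

    if (pid == 0) {              /* child: echo every byte back */
        for (long i = 0; i < iters; i++) {
            read(p2c[0], &b, 1);
            write(c2p[1], &b, 1);
        }
        _exit(0);
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (long i = 0; i < iters; i++) {   /* parent: send, wait for echo */
        write(p2c[1], &b, 1);
        read(c2p[0], &b, 1);
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    /* each round trip is roughly two context switches */
    printf("~%.2f us per context switch\n", ns / iters / 2 / 1000.0);
    return 0;
}

Compile with something like "cc -O2 ctx.c" and take the number with a grain 
of salt; pipe overhead and scheduler noise are folded into the figure.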

Both of these are slightly unfair but much too close to the truth for comfort.

        paul
