Mark J. Reed wrote:
> Am I the only one having bad flashbacks to Occam, here? (Transputing Will
> Change Everything!)
> My $0.02, FWIW:
> Concurrency is surprising. Humans don't think that way. And programs
> aren't written that way - any program represented as a byte stream is
> inherently sequential in nature.
> ...
No, you're not the only person thinking of Occam ... though I should point
out that none of my suggestions are "par" blocks -- a par block makes
every statement within it execute in parallel with the other statements
in that block (much like a Verilog fork/join pair).
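For anyone who hasn't met Occam, here's a rough sketch of that fork/join
behaviour, written in Go purely for concreteness (it's illustrative only,
and not one of the things I'm proposing):

```go
package main

import (
	"fmt"
	"sync"
)

// Rough analogue of an Occam "par" block (or a Verilog fork/join pair):
// every statement in the block is started concurrently, and control only
// passes the end of the block once all of them have finished.
func main() {
	statements := []func(){
		func() { fmt.Println("statement 1") },
		func() { fmt.Println("statement 2") },
		func() { fmt.Println("statement 3") },
	}

	var wg sync.WaitGroup
	for _, stmt := range statements {
		wg.Add(1)
		go func(f func()) { // "fork": each statement runs in its own goroutine
			defer wg.Done()
			f()
		}(stmt)
	}
	wg.Wait() // "join": the par block ends when every statement has completed
}
```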
I disagree with the idea that humans don't think concurrently (though
more often they think in terms of data dependencies). If I ask someone
to boil a kettle and make a cup of tea, and open a can of dogfood to feed
the dog, they'd probably sequence the operations so that the kettle
comes to the boil while they're feeding the dog ... they'd probably put
teabags in the teacups concurrently, too. If you get a group of people
together to achieve some goal (e.g. restore an old building) then the
first thing they'll do is to partition the work so that team members can
work in parallel.
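In code terms, the tea-and-dogfood errand above is just a small dependency
graph. Here's a sketch of it in Go (the task names and timings are purely
illustrative):

```go
package main

import (
	"fmt"
	"time"
)

// The tea-and-dogfood errand as a tiny dependency graph: feeding the dog
// has no dependency on the kettle, so the two proceed concurrently, and
// making the tea waits only on the tasks it actually depends on.
func main() {
	kettleBoiled := make(chan struct{})
	teabagsIn := make(chan struct{})
	dogFed := make(chan struct{})

	go func() { // boil the kettle (the slow task)
		time.Sleep(100 * time.Millisecond)
		fmt.Println("kettle boiled")
		close(kettleBoiled)
	}()
	go func() { // put teabags in the cups, concurrently
		fmt.Println("teabags in the cups")
		close(teabagsIn)
	}()
	go func() { // feed the dog while the kettle heats up
		fmt.Println("dog fed")
		close(dogFed)
	}()

	<-kettleBoiled // making the tea depends on these two tasks...
	<-teabagsIn
	fmt.Println("tea made")
	<-dogFed // ...and the errand as a whole waits for the dog as well
}
```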
The fact that programs aren't written that way is something that I think
needs to change. Though, if done right, things probably don't need
to change too much. A program is not a simple byte stream: it's a
random-access array. Multiple cores imply multiple physical program
counters. If we assume that the core count will double every 18 months,
then three years (two doublings) from now we'll be seeing 32-core
machines quite regularly. I wouldn't be surprised to see this grow even
faster (Moore's
Law doesn't apply to architectural shifts).