I will (probably foolishly) wade in first:

TL;DR: I think there's been a change in what it means to be a
"long-running system".


Once upon a time, you owned a single expensive hardware box with
_extremely_ limited/slow (by today's standards) RAM, CPU, I/O, and
persistent storage. (Maybe you owned more than one of these boxes, and
could network them, but that was relatively slow, too.)

In that scenario, the "system" is a precious thing you continually
mutate, tinker with, and poke in place. What else could you do?

It's similar to owning a house that you maintain and extend over
months and years -- and you have to live in it during the ongoing
construction.


These days, each computing instance has considerably more resources.
What's more, the instances are disposable things you can provision on
demand. And you can even allocate "containers" among one or more of
them. Each of those containers can be immutable, reproducible, and so
on.

So I think the "system" is now this higher level of resource
allocation. Keeping it long-running doesn't mandate that the language
level be mutable and reflective and tinker-ish.

Instead of one precious house that must be maintained while you live
in it -- it's more like you deal in blueprints or designs for a house
factory. Or maybe it's like Airbnb. Analogies are hard. :)

Related: I think there's been a shift from the primacy of the
executable image and hardware to the primacy of source code -- with
version control as the primary source of truth. Reproducibility is
important. The executable and the thing it runs on are merely
discardable optimizations, like a cache.
