Did Verlinde himself draw the Scrabble metaphor? Or was it just Overbye,
the NYT author?
From the paper, it's not clear to me that Verlinde was implying that
the straightened-out (minimum-entropy config?) polymer molecule was
special in any anthropomorphic sense, just that it was special in the
sense of the elastic force.
I have two questions about your distinction between "absence of
arrangement" and "uncertainty". Isn't all uncertainty evaluated in the
context of a particular conceptual target? Similarly, wouldn't any
judgment of arrangement, or the absence thereof, be made in the context
of a particular conceptual target? E.g. Bob can estimate his uncertainty
in guessing the arrangement of Scrabble chips, as arranged by Joe in
another room. And that estimate, were Bob rational and if he thought
Joe were rational, would involve some consideration of the physical
characteristics of the Scrabble chips (tiled plane vs. tiled line,
characters right side up vs. upside down or sideways, etc.)?
And, if so, then regardless of _which_ particular target configuration
(of polymers or Scrabble chips) has the [at|in]tention of the predictor,
can't we estimate the uncertainty associated with each configuration
from its probability of obtaining, including all factors, some purely
physical, some psychological, etc.?
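Here's a toy sketch of the estimation I have in mind, with made-up
configuration names and probabilities (purely illustrative, not from
Verlinde or anyone else): the surprisal -log2 p(c) quantifies the
uncertainty tied to each particular configuration c, and averaging over
all configurations gives the Shannon entropy.

    import math

    # Hypothetical probabilities Bob assigns to Joe's possible
    # arrangements, folding in physical and psychological factors.
    p = {"tidy_line": 0.099, "gettysburg": 0.001, "jumble": 0.900}

    # Per-configuration uncertainty: the surprisal -log2 p(c).
    for config, prob in p.items():
        print(config, round(-math.log2(prob), 2), "bits")

    # Expected surprisal over all configurations = Shannon entropy.
    H = sum(-q * math.log2(q) for q in p.values())
    print("H =", round(H, 2), "bits")   # ~0.48 bits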
These are rhetorical questions; but I wouldn't mind answers to them
anyway, because I'm not convinced of my own conclusion that, although
arrangement and uncertainty _can_ be distinguished, they are not independent.
Grant Holland wrote circa 10-07-14 04:38 AM:
Verlinde makes the same unfortunate argument that is made by scores of
scientists - even noted thermodynamicists - about so-called "disorder":
namely, that certain permutations are "disordered" while other
permutations are not. To wit:
"Think of the universe as a box of scrabble letters. There is only one
way to have the letters arranged to spell out the Gettysburg Address,
but an astronomical number of ways to have them spell nonsense. Shake
the box and it will tend toward nonsense, disorder will increase and
information will be lost as the letters shuffle toward their most
probable configurations. Could this be gravity?"
I find this argument specious.
Just because, from an anthropomorphic, English-speaking bias, he finds
the Gettysburg Address "more ordered" than any other permutation of the
same length does not make it so. They are all permutations of the same
number of letters. Each is as well-defined, and as well-ordered, as any other.
Anyway, "order" is an ill-defined, conflated term within the discussion
of thermodynamics. It enjoys two distinct usages that get oft-conflated
in the conversation regarding entropy. One usage is that it means
"disorganization", "absence of arrangement", "dispersed", etc. This is
approximately the meaning had originally by R. Clausius. The other usage
is that of "uncertainty" or "unpredictability". This is the meaning had
by Shannon. "Disorganized" and "uncertain" do not mean the same thing. I
can prove this because they can vary independently - and, the same
phenomenon can exhibit one without the other - the Organized state can
sometimes be Uncertain...
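A toy demonstration, with distributions I made up just to fix ideas:
suppose the chips will certainly land in one particular jumbled
permutation (disorganized, yet perfectly predictable), versus landing in
any one of 26 equally likely tidy arrangements (organized, yet
unpredictable).

    import math

    def shannon_bits(p):
        """Shannon entropy, in bits, of a probability distribution."""
        return sum(-q * math.log2(q) for q in p if q > 0)

    # One particular jumbled permutation, known with certainty:
    # disorganized, but zero uncertainty.
    print(shannon_bits([1.0]))        # 0.0 bits

    # Any of 26 equally likely tidy (say, alphabetized-by-row)
    # arrangements: organized, but uncertain.
    print(shannon_bits([1/26] * 26))  # ~4.7 bits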
In between the meanings of Clausius and Shannon are the meanings of
entropy put forth by Boltzmann and Gibbs. Those meanings are often taken
to be about "disorganization", but they are actually about "uncertainty":
they involve probabilities. Hence much of the confusion within
statistical thermodynamics about "entropy": the conversation often
assumes that "disorder" means "disorganization" when it actually means
"unpredictability". The confusion is understandable, since Clausius was
all about "dispersion" and "disorganization", while these other two
physicists, Boltzmann and Gibbs, were actually about "uncertainty".
Shannon, on the other hand, was not behaving as a physicist when he
"borrowed" the word "entropy" (upon the insistence of von Neumann) for
his measure of uncertainty. Indeed, he even "borrowed" most of his
formula from Gibbs. With his own definition of entropy, however, Gibbs (and
Boltzmann before him) was doing physics - he was describing a specific
physical phenomenon.
Again, Shannon was not doing physics; rather, he was doing
mathematical statistics. His definition of entropy is a mathematical
function whose domain is the space of probability distributions (to use
the term loosely). With Shannon's entropy, any probability distribution
now has a "measure of unpredictability". Some PDFs have more
unpredictability built into them than others, and his function measures it.
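To put numbers on "more unpredictability built into them" (a toy
computation, with distributions I made up):

    import math

    def shannon_bits(p):
        """Shannon entropy, in bits, of a probability distribution."""
        return sum(-q * math.log2(q) for q in p if q > 0)

    print(shannon_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximally unpredictable
    print(shannon_bits([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits: nearly certain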
Harold Morowitz also makes this point:
"[Shannon's entropy] is a meaningful measure over any probability
distribution, while [Gibbs's thermodynamic entropy] has meaning only if
the p_i are the probabilities of a system being in the i^th quantum
state when the system is at equilibrium, as rigorously defined for
thermodynamics.... [Shannon's entropy] is a measure on a probability
distribution; it is not a physical quantity." [Morowitz 1992]
This is obviously a pet peeve of mine. I welcome any comments!
Grant
p.s. Please excuse the top-posting and full quote... I had some trouble
discretizing your post.
--
glen e. p. ropella, 971-222-9095, http://agent-based-modeling.com
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
lectures, archives, unsubscribe, maps at http://www.friam.org