Hmmm . . . I would say this just slightly differently -- the amount of information an observer gains from observing an event is equal to the decrease in uncertainty the observer has from observing the event (e.g., if I am almost certain an event will occur, I gain almost no information from observing the event; on the other hand, if I observe an event I was very unsure would happen, I gain a lot of information). Decrease in uncertainty and gain in information are just two ways of talking about the same quantity.
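The point that a near-certain event conveys almost no information, while a surprising one conveys a lot, can be checked numerically with the formula quoted further down the thread, I(E) = -log( Pr(E) ). A minimal Python sketch (function names are mine; base-2 logs give units of bits):

```python
import math

def self_information(p):
    """Self-information (surprisal) in bits: I(E) = -log2(Pr(E))."""
    return -math.log2(p)

# A near-certain event carries almost no information...
print(self_information(0.999))   # ~0.0014 bits

# ...while observing a very unlikely event carries a lot.
print(self_information(0.001))   # ~9.97 bits

def entropy(dist):
    """Shannon entropy: the expectation of I(E) over a distribution."""
    return sum(p * self_information(p) for p in dist if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit per flip.
print(entropy([0.5, 0.5]))

# A heavily biased coin: the observer already expects heads, so on
# average each flip decreases their uncertainty by much less.
print(entropy([0.99, 0.01]))     # ~0.08 bits
```

The biased-coin case is exactly the repeat-student situation: same event, but an observer who already expects the outcome gains little from seeing it.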
I'll also make the observation that, for me, information is not a property of an event, but rather of a combined system of event and observer. In particular, two different observers can gain different amounts of information from observing the same event (think about two students attending my lecture on information theory -- if one of them has been through my lecture several times before, they know what to expect, and hence have comparatively little uncertainty about what they will hear, and hence gain little information when they hear it . . .). This is part of why it is valuable to think of the quantity as being uncertainty decrease, rather than information gain -- it keeps some more emphasis on the observer, whose uncertainty is being decreased . . .

Thanks . . . tom

On Jun 5, 2011, at 8:06 PM, Grant Holland wrote:

> Interesting note on "information" and "uncertainty"...
>
> Information is Uncertainty. The two words are synonyms.
>
> Shannon called it "uncertainty"; contemporary information theory calls it
> "information".
>
> It is often thought that the more information there is, the less uncertainty.
> The opposite is the case.
>
> In Information Theory (aka the mathematical theory of communication), the
> degree of information I(E) -- or uncertainty U(E) -- of an event is measurable
> as a decreasing function of its probability, as follows:
>
> U(E) = I(E) = log( 1/Pr(E) ) = log(1) - log( Pr(E) ) = -log( Pr(E) ).
>
> Considering I(E) as a random variable, Shannon's entropy is, in fact, the
> first moment (or expectation) of I(E): Shannon entropy = E[ I(E) ].
>
> Grant
>
> On 6/5/2011 2:20 PM, Steve Smith wrote:
>>
>> "Philosophy is to physics as pornography is to sex. It's cheaper, it's
>> easier and some people seem to prefer it."
>>
>> Modern Physics is contained in Realism, which is contained in Metaphysics,
>> which is contained in all of Philosophy.
>>
>> I'd be tempted to counter:
>>
>> "Physics is to Philosophy as the Missionary Position is to the Kama Sutra"
>>
>> Physics also appeals to Phenomenology and Logic (the branch of Philosophy
>> where Mathematics is rooted), and what we can know scientifically is
>> constrained by Epistemology (the nature of knowledge) and Phenomenology
>> (the nature of conscious experience).
>>
>> It might be fair to say that many (including many of us here) who hold
>> Physics up in some exalted position simply dismiss or choose to ignore all
>> the messy questions considered by *the rest of* Philosophy. Even if we
>> think we have clear/simple answers to the questions, I do not accept that
>> the questions are not worth asking.
>>
>> The underlying point of the referenced podcast is, in fact, that Physics,
>> or Science in general, might be rather myopic and limited by its own
>> viewpoint by definition.
>>
>> "The more we know, the less we understand."
>>
>> Philosophy is about understanding; physics is about knowledge first, and
>> understanding only insomuch as it is a part of natural philosophy.
>>
>> Or at least this is how my understanding is structured around these matters.
>>
>> - Steve
>>
>>> On Sun, Jun 5, 2011 at 1:15 PM, Robert Holmes <rob...@holmesacosta.com>
>>> wrote:
>>>
>>> From the BBC's science podcast "The Infinite Monkey Cage":
>>>
>>> "Philosophy is to physics as pornography is to sex. It's cheaper, it's
>>> easier and some people seem to prefer it."
>>>
>>> Not to be pedantic, but I suspect that s/he has conflated "philosophy"
>>> with "new age", as much of science owes itself to philosophy.
>>>
>>> marcos
>>>
>>> ============================================================
>>> FRIAM Applied Complexity Group listserv
>>> Meets Fridays 9a-11:30 at cafe at St. John's College
>>> lectures, archives, unsubscribe, maps at http://www.friam.org