Gleick, The Information, making meaning and reading

I want to quote a full passage from Gleick and then discuss it:
"There was a difference in emphasis between Shannon and Wiener. For Wiener, entropy was a measure of disorder; for Shannon, of uncertainty. Fundamentally, as they were realizing, these were the same. The more inherent order exists in a sample of English text--order in the form of statistical patterns, known consciously or unconsciously to speakers of the language--the more predictability there is, and in Shannon's terms, the less information is conveyed by each subsequent letter. When the subject guesses the next letter with confidence  it is redundant, and the arrival of the letter contributes no new information. Information is surprise." (p. 247).

It seemed like I was seeing this in action as I typed--typing fast relies on predictability (or memorization?). Are these different? I was thinking about reading as I read this: the more I read, the more predictable what I'm reading becomes, and the less information I receive from it. In fact, I feel like just by looking at a few sentences in a paragraph, or even a few words, I know the whole paragraph (in some writings--those that are more predictable).
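
That guessing experience can be simulated. The sketch below is my own construction (Shannon's actual experiments used human guessers and much longer contexts): it learns letter-pair counts from a text, then scores how surprising each next letter of a new passage is, in bits. Passages that match the learned patterns score low; unpredictable ones score high.

```python
import math
from collections import Counter, defaultdict

def bigram_surprise(train, test):
    """Average bits of surprise per letter of `test`, under letter-pair
    counts learned from `train`."""
    follows = defaultdict(Counter)
    for a, b in zip(train, train[1:]):
        follows[a][b] += 1
    bits = 0.0
    for a, b in zip(test, test[1:]):
        total = sum(follows[a].values())
        # Unseen pairs get a tiny floor probability instead of zero.
        p = follows[a][b] / total if total and follows[a][b] else 1e-6
        bits += -math.log2(p)
    return bits / (len(test) - 1)

text = "the quick brown fox jumps over the lazy dog " * 20
print(bigram_surprise(text, "the quick brown fox"))  # low: familiar patterns
print(bigram_surprise(text, "zqxj vwkp"))            # high: unpredictable
```

This is roughly the sense in which fast typing and fluent reading lean on predictability: most next letters cost almost nothing.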


This takes me back to the "story, study, lesson" of earlier posts. Once I start to read the story, it seems like I already know the lesson. But does this come from "predictability" or from memory, and how are these related?

The more disorder there is in a message, the less we can understand it, and the more redundancy we need to ensure the communication is clear. Do we have less redundancy when the meaning is more clear? So is our job as writers (or computer programmers) to create as much predictability as possible? But if information is surprise and NOT predictability, then doesn't learning require entropy?
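
Shannon actually gave redundancy a number: one minus the ratio of a source's real entropy to the maximum its alphabet allows. A rough back-of-the-envelope (the 1.3 bits/letter figure is the upper end of Shannon's 1951 estimate for printed English, so treat the result as approximate):

```python
import math

# Redundancy = 1 - H / H_max. H_max assumes 27 equally likely symbols
# (26 letters plus space); 1.3 bits/letter is the upper end of Shannon's
# estimated entropy rate for printed English.
H_max = math.log2(27)
H_english = 1.3
redundancy = 1 - H_english / H_max
print(f"English is roughly {redundancy:.0%} redundant")
```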

Is reading (as meaning making) in some sense creating order from entropy?

Is meaning making different when we are talking about information and when we are talking about knowledge? This leads back to the earlier discussion of Spinoza and the question of when something becomes knowledge. In The Information, Gleick relates Margaret Mead's response to the information-theory discussion that the quote above is part of: she argued that information theory should be called "signal theory," because at first there are only signals, which have to be transformed into meaning. At what point is it "information?"

And we are talking, I think, about at least two different kinds of information, perhaps something like basic and more complex. A word is "information" once you know the word, and you can know the word before it is completely before you; in fact, you can often know it from the previous word. That might be called "basic" information, which doesn't really create meaning, because the word means nothing without a context. Then there is information in context, which can also be known before it is all before you. Both kinds of information involve the process of turning disorder into order.

Is the brain "self-organizing" (p. 253)? If not, what's the alternative?

(Struggling with defining/understanding entropy. When things are at total rest, they have essentially no entropy; when things are in chaos, struggling, moving around, or exchanging heat, they have more of it. We know the energy of the universe is constant, so entropy is also a measure of how much flux there is in that energy. Analogy? Perhaps we can think of a bathtub filled with water, with a divider separating one half from the other. One half is cold, the other warm. Just sitting there (yes, I know the atoms are moving around), the water is visibly doing nothing; the system has low entropy. Remove the divider and the two halves mix: you increase entropy. The other interesting thing is that you can never put the water back into its separate halves. In that sense the increase in entropy is irreversible (this is also one way of defining time--as entropy moves forward, it cannot be reversed). The universe's natural pull is toward entropy, toward disorder. We are always trying to impose order. In information, we take random symbols and try to impose order upon them.)
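
A quick numeric check of the bathtub picture (the masses and temperatures below are my own made-up numbers, and real mixing is messier than this two-temperature model, but the sign of the result is the point):

```python
import math

# Mix equal masses of hot and cold water and compute the entropy change.
m = 50.0                          # kg of water on each side of the divider
c = 4186.0                        # specific heat of water, J/(kg*K)
T_hot, T_cold = 313.15, 283.15    # 40 C and 10 C, in kelvin

T_final = (T_hot + T_cold) / 2    # equal masses -> simple average

# Each half's entropy change is m*c*ln(T_final/T_initial): the hot side
# loses entropy, the cold side gains more, and the sum is always positive.
dS_hot = m * c * math.log(T_final / T_hot)
dS_cold = m * c * math.log(T_final / T_cold)
print(f"hot side:  {dS_hot:+.0f} J/K")
print(f"cold side: {dS_cold:+.0f} J/K")
print(f"total:     {dS_hot + dS_cold:+.0f} J/K  (positive: the mixing won't undo itself)")
```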
