TPF : Information and Randomness
Posted: Sun Apr 07, 2024 4:20 pm
The greatest degree of information is found in the most random or irrational sequences.
I find this strange and counter intuitive. — Benj96
That common misunderstanding of Information theory is indeed counterintuitive, because we know from experience that randomness is the antithesis of meaning-bearing Information. But Shannon was not claiming that random sequences are inherently meaningful. Instead, he compared mental Information to physical Entropy, and noted that it is "surprising" to find meaningful information in random patterns*1. That eye-opening distinction of meaning from background noise is what semiotician & cyberneticist Bateson called "the difference that makes a difference"*2. The first "difference" is the Surprise, and the second is the Meaning.
According to the second law of thermodynamics, all order ultimately decays into disorder. And yet here we stand on a tiny exception to that rule: the pocket of organization, in the vast universe, that we call home. As far as we know, this is the only instance of Life & Mind in the universe. Ironically, some thinkers miss the exceptional nature of Information, and are still looking for communications from little green men, or the advanced race of San-Ti, out there in the near-infinite crucible of random accidents. Information is not accidental.
*1. Information is the surprising exception to common randomness :
The core idea of information theory is that the "informational value" of a communicated message depends on the degree to which the content of the message is surprising.
https://en.wikipedia.org/wiki/Entropy_( ... on_theory)
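Shannon made this notion of "surprise" precise: the self-information (surprisal) of an event is measured in bits as the negative log of its probability, and the entropy of a source is the average surprisal over its messages. A minimal sketch of both formulas (my own illustration, not from the quoted article):

```python
import math

def surprisal(p: float) -> float:
    """Shannon self-information in bits: -log2(p). Rarer events are more 'surprising'."""
    return -math.log2(p)

def entropy(probs) -> float:
    """Average surprisal of a source, in bits per symbol."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# A certain event carries no information; a rare one carries a lot.
print(surprisal(1.0))    # 0.0 bits
print(surprisal(0.5))    # 1.0 bit (one fair coin flip)
print(surprisal(1/256))  # 8.0 bits

# The "counterintuitive" part: a uniform (maximally random) source has the
# highest entropy, while a biased (more predictable) source has less.
print(entropy([0.5, 0.5]))  # 1.0 bit per symbol
print(entropy([0.9, 0.1]))  # less than 1 bit per symbol
```

This is why randomness maximizes Shannon's measure even though, as the post argues, maximal randomness is not the same thing as meaning.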
*2. Information as a difference :
We propose a difference theory of information that extends Gregory Bateson’s definition that information is any difference that makes a difference.
https://www.tandfonline.com/doi/full/10 ... 19.1581441
CAN YOU SEE THE DIFFERENCE ?
Surprising Signal within Meaningless Noise