The Conservation of Information and The Scandal of Deduction
https://thephilosophyforum.com/discussi ... ent/816530
There is a (contested) claim in physics that information cannot be created or destroyed. While thinking this through, it occurred to me that formulations of this claim often rely on the "scandal of deduction," the idea that deductive reasoning produces no information. . . . At first glance this seems wrong. Our universe began in a very low entropy state, . . . — Count Timothy von Icarus
Since I have no formal training in Philosophy or higher Math, my comments on the notion that "deductive reasoning produces no information" may not be of high quality. Clearly defining the relationship between "reasoning" and Information content could get into some tricky reasoning beyond my limited abilities. But since my amateur philosophical retirement hobby is based on Quantum & Information theories, I may be able to make some pertinent comments. I hope they are not off-topic.
First, the quoted phrase above parallels the Second Law of Thermodynamics, and it seems to equate Information with Energy. Indeed, some scientists today do make that equation*1. Since the Second Law implies that the universe began with a limited amount of Energy/Information, some might infer that maximum Entropy at "Heat Death" would mean zero Energy & zero Information : hence zero Universe. But is it possible that local energy/information users could import energy from the almost-empty vastness of space-time*2, in order to optimize their local throughput of energy & information? That may be how some sci-fi alien species become entrepreneurial space-farers, exploring & conquering less efficient (or less fortunate) species, such as Earthlings.
In that case, with the Quality of Information in mind instead of just the Quantity, the value of Information would be not just conserved but compounded, like money in an interest-bearing account, or in risky-but-high-yield investments. Relative to the whole universe, increasing Entropy is equivalent to a negative interest rate, such as monetary inflation. Our Cosmos is a closed banking system, but it began with a stockpile of low-entropy, high-energy/information. So the evolution of increasing intelligence (information quality) could conceivably offset the energy inflation (entropy) of expansion without new inputs*3.
Therefore, I agree that Deduction shouldn't be scandalous just because, unlike Induction/Production, it doesn't output new information (facts). On the other hand, by testing old information (beliefs), Deduction should produce better-quality information. For an evolutionary analogy: animals that developed intelligence in response to environmental stresses gradually evolved into new species. Physically they are still animals, but metaphysically a new species (homo economicus) that is more efficient at processing information mentally. No longer dependent on teeth & claws, they use imagination & ideas to make their fortune in the world.
PS___Energy and Information are not the same thing, but different forms of the same thing : Potential.
*1. Information converted to energy :
Toyabe and colleagues have observed this energy-information equivalence . . .
https://physicsworld.com/a/information- ... to-energy/
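The energy–information link in that experiment is usually quantified by Landauer's principle: erasing one bit of information must dissipate at least kT·ln(2) of heat. As a rough numerical sketch (the temperature of 300 K is an assumption for illustration, not a figure from the article):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k*T*ln(2) joules.
# k is Boltzmann's constant (exact SI value); T = 300 K is an assumed
# room temperature chosen purely for illustration.
k = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0         # assumed temperature, kelvin

landauer_limit = k * T * math.log(2)  # minimum energy cost per erased bit
print(f"Minimum energy to erase one bit at {T} K: {landauer_limit:.3e} J")
```

The number that comes out (on the order of 10^-21 joules) shows why this energy-information exchange is only detectable in delicate nanoscale experiments like Toyabe's.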
*2. Vacuum Energy :
Vacuum energy is an underlying background energy that exists in space throughout the entire Universe.
https://en.wikipedia.org/wiki/Vacuum_energy
*3. Weight of Evidence and Information Value :
In data analysis, Logistic Regression models take as input categorical and numerical data, and output the probability of the occurrence of the event.
https://www.analyticsvidhya.com/blog/20 ... ion-value/
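The "Information Value" mentioned in that link is conventionally computed from the Weight of Evidence of each category of a predictor. A minimal sketch with invented counts (the numbers below are illustrative assumptions, not data from the article):

```python
import math

# Weight of Evidence (WoE) and Information Value (IV) for one categorical
# predictor of a binary event. The counts are made up for illustration.
goods = [80, 20]   # event counts per category (assumed)
bads  = [30, 70]   # non-event counts per category (assumed)

total_good, total_bad = sum(goods), sum(bads)
iv = 0.0
for g, b in zip(goods, bads):
    dist_g = g / total_good          # share of all "goods" in this category
    dist_b = b / total_bad           # share of all "bads" in this category
    woe = math.log(dist_g / dist_b)  # Weight of Evidence for the category
    iv += (dist_g - dist_b) * woe    # each category's contribution to IV
print(f"Information Value: {iv:.3f}")
```

A higher IV means the predictor does more to separate events from non-events, i.e. carries more usable information about the outcome.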
Note --- Shannon's mathematical definition of Information is based on statistical Probability
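To make that note concrete: in Shannon's definition the information of an outcome is −log₂ of its probability, and entropy is the average information per outcome. A minimal illustration:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: exactly 1 bit per toss
print(entropy([0.9, 0.1]))  # biased coin: more predictable, ~0.469 bits
print(entropy([1.0]))       # certain outcome: 0 bits, no information
```

The more predictable the source, the less information each outcome carries, which is exactly the sense in which Information here is grounded in statistical Probability.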
TPF : Conservation of Information
Re: TPF : Conservation of Information
There is a (contested) claim in physics that information cannot be created or destroyed. . . . The claim here is that, even if T1 can be described in fewer bits than T2, you can just evolve T1 into T2, thus a description of T1 actually is a description of T2! This implies the scandal of deduction, that nothing new is learned from deterministic computation. — Count Timothy von Icarus
The causal role of information in the world is of interest to me, both scientifically and philosophically. Can you provide a link to a site or publication where that "claim" is made? It might clarify any presumptions behind such an assertion.
I'd be surprised if materialist/physicalist/deterministic scientists would think in terms of "learning" in a law-limited "deterministic" system*1. However, computer scientists and Information theorists do sometimes use such anthropomorphic terminology metaphorically*2. So, if the "laws of nature" are imagined as a computer program, the universe could conceivably learn, in the same sense that Artificial Intelligence does*3, by means of "non-deterministic algorithms"*4.
But AI is not natural, and currently requires a natural Programmer to establish the parameters of the system. Would a self-organizing, self-learning world also require the services of a preter-natural Programmer to bootstrap the system?
As I mentioned above, this kind of sci-phi (science/philosophy) is over my head. But I'm learning. Does that mean I was not destined to post on such an abstruse question?
*1. What is physics-informed learning? :
What is physics-informed machine learning? Machine learning is a branch of artificial intelligence and computer science that focuses on the use of data and algorithms that attempt to imitate the function of the human brain, improving in accuracy over time.
https://www.pnnl.gov/explainer-articles ... e-learning
*2. Physicists working with Microsoft think the universe is a self-learning computer :
https://thenextweb.com/news/physicists- ... g-computer
*3. Causal Determinism :
Causal determinism is, roughly speaking, the idea that every event is necessitated by antecedent events and conditions together with the laws of nature.
https://plato.stanford.edu/entries/determinism-causal/
*4. What is a nondeterministic algorithm? :
A nondeterministic algorithm is an algorithm that, given a particular input, can produce different outputs. This is in contrast to a deterministic algorithm, which will always produce the same output for a given input. Nondeterministic algorithms are often used in artificial intelligence (AI) applications.
https://www.aiforanyone.org/glossary/no ... -algorithm
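The contrast in that definition can be simulated with a randomized algorithm: the same input can yield different, but equally valid, outputs. A toy sketch, assuming Python's random module as the stand-in source of nondeterminism:

```python
import random

data = [3, 7, 7, 1, 7, 5]

def deterministic_argmax(xs):
    # Always returns the FIRST index of the maximum: same input, same output.
    return xs.index(max(xs))

def nondeterministic_argmax(xs, rng=random):
    # Returns a randomly chosen index of the maximum: same input,
    # possibly different outputs across runs, all of them correct.
    m = max(xs)
    return rng.choice([i for i, x in enumerate(xs) if x == m])

print(deterministic_argmax(data))     # always 1
print(nondeterministic_argmax(data))  # 1, 2, or 4, varying run to run
```

In this toy sense, a "learning" universe would be one whose program is allowed the second kind of step, not just the first.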
Re: TPF : Conservation of Information
"Information" is a very tough term because it is defined in loads of different ways. I suppose here I should have used "Kolmogorov Complexity" in every instance. This is a measure of how many bits it takes to describe something (really, how many bits a computer program would need to be to produce a full description as output).
So, that said, I would think that the "heat death" scenario, where the universe is in thermodynamic equilibrium, would have the greatest complexity/take the most bits to describe, since a description of its macroproperties excludes a maximal number of possible microstates, each of which must be excluded by specifying information. — Count Timothy von Icarus
Sorry, "Kolmogorov Complexity"*1 is way over my little pointy head. And, while your comments in the quote may have something to do with the hypothesis that the universe is a computer program, they don't address my request*2 for a link to the Second Law assertion.
The Santa Fe Institute studies physical Complexity from an information-theoretic perspective, and they can get into some pretty abstruse word-salads. But my interest is much simpler, and primarily regards the relationship between Meaningful Information and Causal Energy : i.e. Information defined in terms of Energy. Hence, I'd like to know more about the implications of the Second Law for the Conservation of Information.
So, I'll ask again : "Can you provide a link to a site or publication where that claim is made? It might clarify any presumptions behind such an assertion." I'm really interested in the basis of that claim, not at all in "Kolmogorov Complexity".
*1. Kolmogorov complexity :
The notion of Kolmogorov complexity can be used to state and prove impossibility results akin to Cantor's diagonal argument, Gödel's incompleteness theorem, and Turing's halting problem. In particular, no program P computing a lower bound for each text's Kolmogorov complexity can return a value essentially larger than P's own length (see section § Chaitin's incompleteness theorem); hence no single program can compute the exact Kolmogorov complexity for infinitely many texts.
https://en.wikipedia.org/wiki/Kolmogorov_complexity
Note --- "prove impossibility results"???? Not even in the ballpark of my layman dilettante vocabulary.
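For what the Wikipedia passage is gesturing at: Kolmogorov complexity itself is uncomputable, but the size of a compressed file gives a computable upper bound on it and conveys the intuition. A highly patterned string has a short description, while an effectively random one needs roughly its own length. A rough sketch, using zlib as a stand-in for "shortest description":

```python
import hashlib
import zlib

# A highly patterned 2000-byte string: its short description is
# essentially "'ab' repeated 1000 times".
patterned = b"ab" * 1000

# A pseudo-random 2016-byte string built from a SHA-256 hash chain:
# statistically incompressible, so no description much shorter than
# the string itself is known.
chunks, seed = [], b"seed"
for _ in range(63):
    seed = hashlib.sha256(seed).digest()
    chunks.append(seed)
noisy = b"".join(chunks)

# Compressed size is a computable UPPER BOUND on Kolmogorov complexity.
print(len(zlib.compress(patterned)))  # tiny: the pattern compresses away
print(len(zlib.compress(noisy)))      # roughly the original 2016 bytes
```

This is also the sense in which a maximum-entropy "heat death" state takes the most bits to describe: there is no pattern left to exploit for a shorter description.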
*2. I don't know what the reference to "impossible complexity" has to do with this claim :
"There is a (contested) claim in physics that information cannot be created or destroyed". I was hoping you could point me to the source of that claim, so I could understand its implications : e.g. Metaphysical Information is as fundamental/essential to reality as Physical Energy.