Pancomputationalism

Pancomputationalism (pan-computationalism, naturalist computationalism) is the view that the universe is a huge computational machine, or rather a network of computational processes, which, following fundamental physical laws, computes (dynamically develops) its next state from the current one.
In this approach, the stuff of the universe is variously taken to be:
* Essentially informational
* Essentially digital
* Both digital and analog - depending on the level of abstraction
History
Every epoch and culture has a different conception of the Universe. For some it is, in its entirety, a living organism. For Ptolemy, Descartes, and Newton the Universe was best conceived in a mechanistic way as some vast machine. Our current understanding in terms of information and computing has led to a conception of the Universe as, more or less explicitly, a computer.
In 1623, Galileo claimed in his book The Assayer (Il Saggiatore) that the book of nature is written in the language of mathematics and that the way to understand nature is through mathematics. Pancomputationalism generalizes "mathematics" to "computation", so that Galileo's great book of nature becomes more of a computer.
Konrad Zuse was the first to suggest (in 1967) that the physical behavior of the entire universe is computed at a basic level, possibly by cellular automata, by the universe itself, which he referred to as "Rechnender Raum" (Computing Space/Cosmos).
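As a concrete illustration (a minimal sketch, not Zuse's own construction), an elementary one-dimensional cellular automaton shows the kind of dynamics meant here: a fixed local rule computes the global next state from the current one.

```python
# A minimal sketch of a 1D elementary cellular automaton, illustrating the
# kind of discrete, local dynamics Zuse proposed (not his actual construction).
# The current global state fully determines the next one via a fixed rule.

RULE = 110  # Wolfram's rule 110, known to be computationally universal

def step(cells: list[int]) -> list[int]:
    """Compute the next global state from the current one, cell by cell."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value in 0..7
        nxt.append((RULE >> neighborhood) & 1)  # look up the matching rule bit
    return nxt

if __name__ == "__main__":
    state = [0] * 31
    state[15] = 1                 # a single "seed" cell
    for _ in range(15):           # the "universe" evolving step by step
        print("".join("#" if c else "." for c in state))
        state = step(state)
```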
Pancomputationalists
* Konrad Zuse - Rechnender Raum (translated into English by MIT as Calculating Space, 1970)
* Norbert Wiener
* Edward Fredkin
* Stephen Wolfram
* Gregory Chaitin
* Seth Lloyd
* Gerard 't Hooft
* Charles Seife
* David Deutsch
* Max Tegmark - the Ultimate ensemble
* Jürgen Schmidhuber - the ultimate ensemble of all computable universes
* Aaron Sloman - virtual machines
* Peter J. Bentley - systemic computation
* Carl Friedrich von Weizsäcker - the quantum theory of ur-alternatives (Einheit der Natur, 1971)
* John Archibald Wheeler - "It from bit"
Computation
Computation is the process a physical system undergoes when it processes information.
Computation as a phenomenon is studied within several research fields: the theory of computation (including computability theory), physics, biology, and so on. According to the ACM/IEEE Computing Curricula (2005), the field of Computing includes Computer Science, Computer Engineering, Software Engineering, and Information Systems. German, French, and Italian use the respective terms "Informatik", "Informatique", and "Informatica" ("Informatics" in English) to denote Computing.
Pancomputationalism and Computational theory of mind
In a computing universe, human bodies and human minds are computational too, on several levels of granularity (levels of description). There are numerous indications from cognitive science and neuroscience that reveal the computational character of cognition. Pancomputationalism offers an elegant solution to the controversy about the digital vs. analog character of cognitive phenomena by suggesting that both sorts of explanations are necessary for a complete description of the observed behaviours.
Info-Computational Naturalism (ICON)
Info-Computational Naturalism (ICON) unifies pancomputationalism with paninformationalism, the view that the fabric of the universe is informational. ICON claims that while the structure of the universe is informational, its dynamics (change) is computation (information processing).
Natural Computation Generalizing the Turing Machine
The Turing Machine (TM) model identifies computation with the execution of an algorithm, and the question is how widely it applies. The Church-Turing Thesis identifies the informal notion of an algorithm with TM computability, which is often interpreted to imply that all computation must be algorithmic. With the advent of computer networks, which are the main paradigm of computing today, the model of a computer in isolation, represented by a Universal Turing Machine, has become insufficient.
The basic difference between an isolated computing box and a network of computational processes (nature understood as a computational mechanism) is the interactivity of computation. The most general computational paradigm today is interactive computing.
Consequently, in recent years, computability has expanded beyond its original TM scope.
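The contrast can be sketched in a few lines of Python (an illustration, not a formal model): a TM-style computation takes all of its input up front, produces one result, and halts, while an interactive computation is an ongoing, stateful exchange with its environment.

```python
from typing import Iterator

# An illustrative sketch, not a formal model: batch (TM-style) computation
# versus interactive computation.

def batch_computation(x: int) -> int:
    """TM-style: all input given up front, one final output, then halt."""
    return x * x

def interactive_computation(inputs: Iterator[int]) -> Iterator[int]:
    """Interactive style: an ongoing exchange with the environment.
    The process keeps internal state, need never halt, and each output
    can influence what the environment sends next."""
    state = 0
    for x in inputs:   # each message from the environment ...
        state += x     # ... updates the internal state ...
        yield state    # ... and immediately produces a response

if __name__ == "__main__":
    print(batch_computation(4))                             # 16, then done
    print(list(interactive_computation(iter([1, 2, 3]))))   # [1, 3, 6]
```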
The deep interconnection between "computation" and "proof" has resulted in significant work in constructive mathematics and mathematical logic ever since Hilbert's famous program in the 1920s. Computation is fundamentally connected with logic, another research field that is nowadays developing rapidly. Understood in the most general, interactive sense, logic can be seen as games played by an agent against its environment. Computability of a problem then means the existence of an agent that always wins the corresponding game. Logical operators stand for operations on computational problems, and the validity of a logical formula means being a scheme of "always computable" problems.
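As a toy example of this game view (a hedged sketch, not any particular formal system of game semantics), take the problem "for every number x there exists its successor x + 1": the environment moves first by choosing x, the agent responds, and the problem is computable because the agent has a strategy that wins every play.

```python
import random

# A toy sketch of "computability as a game" (illustrative only). The problem:
# for every x chosen by the environment, produce y with y == x + 1. It is
# computable because the agent has a strategy that wins every single play.

def agent_strategy(x: int) -> int:
    """The agent's winning strategy: always answer x + 1."""
    return x + 1

def play_one_round() -> bool:
    """One play of the game: the environment moves, the agent responds."""
    x = random.randint(-10**6, 10**6)  # the environment's move
    y = agent_strategy(x)              # the agent's move
    return y == x + 1                  # the winning condition

if __name__ == "__main__":
    # The agent never loses, so the problem counts as "always computable".
    assert all(play_one_round() for _ in range(1000))
    print("agent won all 1000 rounds")
```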
The challenge to deal with computability in the real world (such as computing on continuous data, biological computing/organic computing, and quantum computing, or generally natural computing) has brought new understanding of computation.
Natural computing has different criteria for the success of a computation: the halting problem is not the issue; what matters instead is the adequacy of the computational response in a network of interacting computational processes/devices. In many areas, we have to computationally model emergence, which is not clearly algorithmic (Barry Cooper).
New computational paradigms are based on metaphors for natural phenomena and on computer simulations that mimic nature. New questions in focus are:
* What can be learned from natural computation? Is there any non-algorithmic computation?
* Is there a universal model (for which the TM model is a special case) underlying all natural computing? (Wegner and Goldin suggest Interactive Computing which uses paraconsistent logic for an open interactive system.)
Biologists, information scientists, cognitive scientists, bioinformaticists, logicians, theoretical physicists and many other researchers are attracted by the possibilities which this new interactive computational paradigm opens.
Info-computational (ICON) Epistemology Naturalized
Naturalized epistemology (Feldman, Kornblith, Stich) is, in general, the idea that knowledge may be studied as a natural phenomenon: the subject matter of epistemology is not our concept of knowledge, but knowledge itself.
“The stimulation of his sensory receptors is all the evidence anybody has had to go on, ultimately, in arriving at his picture of the world. Why not just see how this construction really proceeds? Why not settle for psychology?” (Epistemology Naturalized, Quine 1969)
Pancomputationalists rephrase the question as: "Why not settle for computing?" (Dodig Crnkovic 2006)
Naturalist Understanding of Cognition
According to Maturana and Varela (1980), even the simplest organisms possess cognition, and their meaning-production apparatus is contained in their metabolism. There are, of course, also non-metabolic interactions with the environment, such as locomotion, that likewise generate meaning for an organism by changing its environment and providing new input data.
Maturana and Varela's understanding of cognition is most suitable as the basis for a computationalist account of naturalized evolutionary epistemology.
A great conceptual advantage of making cognition a central focus of study is that all living organisms possess cognition to some degree.
Universe Computer, Not A Typewriter
What is the mechanism of the evolutionary development of cognitive abilities in organisms?
Critics of the evolutionary approach point to the supposed impossibility of "blind chance" producing such highly complex structures as intelligent living organisms. The proverbial monkeys typing Shakespeare are often used as an illustration.
The Chaitin-Bennett counterargument: the universe is not a typewriter but a computer, so the monkey types random input into a computer.
"Quantum mechanics supplies the universe with "monkeys" in the form of random fluctuations, such as those that seeded the locations of galaxies. The computer into which they type is the universe itself.
From a simple initial state, obeying simple physical laws, the universe has systematically processed and amplified the bits of information embodied in those quantum fluctuations.
The result of this information processing is the diverse, information-packed universe we see around us: programmed by quanta, physics gave rise first to chemistry and then to life; programmed by mutation and recombination, life gave rise to Shakespeare; programmed by experience and imagination, Shakespeare gave rise to Hamlet.
You might say that the difference between a monkey at a typewriter and a monkey at a computer is all the difference in the world." (Lloyd 2006)
The universe computer on which the monkey types is at the same time the hardware and the program, much as a universal Turing machine holds its program on the same tape as its data.
An example from biological computing is DNA, where the hardware (the molecule) is at the same time the software (the program, the code). In general, each new input restructures the computational universe and changes the preconditions for future inputs. These processes are interactive and self-organizing, and that is what provides the essential speed-up in the process of producing more and more complex structures.
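The typewriter/computer contrast can be made concrete with a toy sketch (an illustration constructed here, not Lloyd's or Chaitin's actual argument): the same random keystrokes are inert on a typewriter, but fed into a computer, here a cellular automaton whose rule the monkey types, they are systematically processed and amplified.

```python
import random

# A toy illustration of the typewriter/computer contrast (not Lloyd's or
# Chaitin's actual construction). The same random keystrokes are inert on a
# typewriter, but fed into a computer they are processed and amplified.

def ca_step(cells: list[int], rule: int) -> list[int]:
    """One update of an elementary cellular automaton: the 'computer'."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2)
                      | (cells[i] << 1)
                      | cells[(i + 1) % n])) & 1
            for i in range(n)]

keystrokes = [random.randint(0, 1) for _ in range(8)]  # the monkey's typing

# Typewriter: the random bits just sit on the page, unprocessed.
print("typewriter page:", "".join(map(str, keystrokes)))

# Computer: the same 8 bits, read as a CA rule, seed a whole evolving history.
# How rich the pattern is depends on which rule the monkey happens to type,
# but the point stands: the input is processed, not merely stored.
rule = int("".join(map(str, keystrokes)), 2)
state = [0] * 31
state[15] = 1
for _ in range(12):
    print("".join("#" if c else "." for c in state))
    state = ca_step(state, rule)
```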
 