Pancomputationalism

Pancomputationalism (also pan-computationalism or naturalist computationalism) is the view that the universe is a huge computational machine, or rather a network of computational processes which, following fundamental physical laws, computes (dynamically develops) its own next state from the current one.

In this approach, the stuff of the universe is variously held to be:
* Essentially informational
* Essentially digital
* Both digital and analog – depending on the level of abstraction

History

Every epoch and culture has had its own conception of the Universe. For some it is, in its entirety, a living organism. [Viz. Thales of Miletus, Spinoza, and Kafatos & Nadeau 1999.] For Ptolemy, Descartes, and Newton the Universe was best conceived mechanistically, as a vast machine. Our current understanding in terms of information and computing has led to a conception of the Universe as, more or less explicitly, a computer.

In 1623, Galileo claimed in his book The Assayer (Il Saggiatore) that the book of nature is written in the language of mathematics and that the way to understand nature is through mathematics. Pancomputationalism generalizes "mathematics" to "computation", so that Galileo's great book of nature becomes more of a computer.

Konrad Zuse was the first to suggest (in 1967) that the physical behavior of the entire universe is being computed at a basic level, possibly on cellular automata, by the universe itself, which he referred to as "Rechnender Raum" (Computing Space/Cosmos).
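
As a toy illustration of the kind of picture Zuse had in mind (a sketch only, not his actual model), consider a one-dimensional cellular automaton: the whole "world" is a row of bits, and its next global state is computed from the current one by applying a single simple local rule everywhere at once. The rule number and grid size below are arbitrary choices.

```python
# Toy sketch of a universe-as-cellular-automaton, loosely in the spirit of
# Zuse's "Rechnender Raum" (not his actual model): the whole "world" is a row
# of bits, and the next global state follows from the current one by a simple
# local rule applied to every cell at once.

def step(cells, rule=110):
    """Compute the next state of an elementary cellular automaton.

    `rule` is the Wolfram rule number; 110 is an arbitrary illustrative choice.
    """
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right   # value 0..7
        nxt.append((rule >> neighborhood) & 1)               # look up rule bit
    return nxt

if __name__ == "__main__":
    world = [0] * 40
    world[20] = 1                        # minimal initial condition
    for _ in range(20):                  # "time" = repeated global updates
        print("".join("#" if c else "." for c in world))
        world = step(world)
```

Each cell's next value depends only on its immediate neighbourhood, mirroring the locality of physical laws.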

Pancomputationalists

* Konrad Zuse "Rechnender Raum" (translated by MIT into English as "Calculating Space", 1970)
* Norbert Wiener
* Edward Fredkin [Fredkin, Edward, "Digital Mechanics", Physica D, (1990), 254-270 North-Holland.]
* Stephen Wolfram [Wolfram, Stephen, "A New Kind of Science", (2002), Wolfram Science [http://www.wolframscience.com/] .]
* Gregory Chaitin [Chaitin, Gregory, "Epistemology as Information Theory: From Leibniz to Ω" [http://www.cs.auckland.ac.nz/~chaitin/ecap.html] .]
* Seth Lloyd [S. Lloyd, "The Computational Universe: Quantum gravity from quantum computation", [http://arxiv.org/abs/quant-ph/0501135 preprint] .]
* Gerard 't Hooft. [G. 't Hooft, "Quantum Gravity as a Dissipative Deterministic System", Class. Quant. Grav. 16, 3263-3279 (1999) [http://arxiv.org/abs/gr-qc/9903084 preprint] .]
* Charles Seife
* David Deutsch
* Max Tegmark and his Ultimate ensemble
* Jürgen Schmidhuber and his ultimate ensemble of all computable universes [J. Schmidhuber (1997): A Computer Scientist's View of Life, the Universe, and Everything. Lecture Notes in Computer Science, pp. 201-208, Springer: http://www.idsia.ch/~juergen/everything/ ] [J. Schmidhuber (2000): Algorithmic Theories of Everything http://arxiv.org/abs/quant-ph/0011122 ]
* Carl Friedrich von Weizsäcker and his quantum theory of ur-alternatives (Einheit der Natur, 1971)
* John Archibald Wheeler's "It from bit"

Computation

Computation is the process a physical system undergoes when it processes information. Computation as a phenomenon is studied within several research fields: the theory of computation (including computability theory), physics, biology, and so on. According to the ACM/IEEE Computing Curricula (2005), the computing field includes Computer Science, Computer Engineering, Software Engineering and Information Systems. German, French and Italian use the terms "Informatik", "Informatique" and "Informatica" respectively (informatics in English) to denote computing.

Pancomputationalism and Computational theory of mind

In a computing universe, human bodies and human minds are computational too, on several levels of granularity (levels of description). Numerous findings from cognitive science and neuroscience point to the computational character of cognition. Pancomputationalism offers an elegant solution to the controversy over the digital vs. analog character of cognitive phenomena by suggesting that both sorts of explanation are necessary for a complete description of the observed behaviours.

Info-Computational Naturalism (ICON)

Info-Computational Naturalism (ICON) unifies pancomputationalism with paninformationalism, the view that the fabric of the universe is informational. ICON claims that while the structure of the universe is informational, its dynamics (change) is computation (information processing). [G. Dodig-Crnkovic, "Investigations into Information Semantics and Ethics of Computing", Mälardalen University Press, (2006), [http://www.diva-portal.org/mdh/theses/abstract.xsql?dbid=153] .]

Natural Computation Generalizing the Turing Machine

The Turing Machine (TM) model identifies computation with the execution of an algorithm, and the question is how widely it is applicable. The Church-Turing Thesis identifies the informal notion of an algorithm with computation by a TM, which is often interpreted as implying that all computation must be algorithmic. With the advent of computer networks, which are the main paradigm of computing today, the model of a computer in isolation, represented by a Universal Turing Machine, has become insufficient.
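
As a point of reference for the discussion below, here is a minimal sketch of that classical picture: a machine with a fixed transition table runs on a finite input and halts with a result. The particular machine (a unary increment) and its encoding are invented here for illustration; they are not taken from the cited literature.

```python
# Minimal sketch of the classical Turing-machine picture of computation:
# a fixed transition table, a finite input, and a run that halts with a result.
# The example machine (unary increment: append one more '1') is an arbitrary
# illustration, not a machine from the cited literature.

BLANK = "_"

def run_tm(transitions, tape, state="start", max_steps=1000):
    """Run a deterministic TM until it reaches the 'halt' state (or gives up)."""
    tape = dict(enumerate(tape))          # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, BLANK)
        new_symbol, move, state = transitions[(state, symbol)]
        tape[head] = new_symbol
        head += 1 if move == "R" else -1
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(BLANK)

# Unary increment: scan right over the 1s, write one extra 1 on the blank, halt.
INCREMENT = {
    ("start", "1"): ("1", "R", "start"),
    ("start", BLANK): ("1", "R", "halt"),
}

if __name__ == "__main__":
    print(run_tm(INCREMENT, "111"))       # -> "1111"
```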

The basic difference between an isolated computing box and a network of computational processes (nature understood as a computational mechanism) is the interactivity of computation. The most general computational paradigm today is interactive computing. [Goldin D, Smolka S and Wegner P Ed. (2007) "Interactive Computation: The New Paradigm", Springer-Verlag.] Consequently, in recent years, the notion of computability has expanded beyond its original TM scope.
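
The contrast with the halting machine above can be made concrete with a hedged sketch: an interactive agent is a persistent process that keeps exchanging messages with its environment and whose responses may depend on the whole interaction history; there is no built-in halting state. The agent and environment below are invented for illustration and are not the formal Wegner-Goldin model.

```python
# Hedged sketch of interactive computation (not the formal Wegner/Goldin model):
# instead of mapping one input string to one output and halting, the agent is a
# persistent process that keeps exchanging messages with its environment, and
# its response may depend on the whole interaction history.

from itertools import count

def agent():
    """A coroutine-style agent: yields a response to each input it receives."""
    history = []                      # state accumulated across interactions
    response = None
    while True:                       # no halting state: interaction is open-ended
        stimulus = yield response
        history.append(stimulus)
        response = f"response #{len(history)} to {stimulus!r}"

def environment(steps=5):
    """A toy environment that drives the agent and reads its answers."""
    a = agent()
    next(a)                           # prime the coroutine
    for t in count():
        if t >= steps:                # the environment, not the agent, ends the run
            break
        print(a.send(f"event-{t}"))

if __name__ == "__main__":
    environment()
```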

The deep interconnection between "computation" and "proof" has resulted in significant work in constructive mathematics and mathematical logic ever since Hilbert's famous program in the 1920s. Computation is fundamentally connected with logic, a field that is itself developing rapidly. Understood in the most general, interactive sense, logic can be seen in terms of games played by an agent against its environment, with computational problems represented as games. Computability of such a problem means the existence of an agent that always wins the game. Logical operators stand for operations on computational problems, and the validity of a logical formula means that it is a scheme of "always computable" problems.
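
The game reading can be made slightly more concrete with a toy sketch (an informal illustration, not a formal game semantics): a computational problem is a game in which the environment poses a question and the agent answers, and the problem counts as computable if the agent has a strategy that wins every play. The problem used below (given n, produce n + 1) is an arbitrary example.

```python
# Toy illustration of "problems as games": the environment picks a question,
# the agent answers, and a judge decides who won that play.  A problem counts
# as "computable" in this picture if some agent strategy wins every possible
# play.  This is an informal sketch, not a formal game semantics.

import random

def play(agent_strategy, wins, environment_moves, rounds=10):
    """Return True if the agent wins every round against random environment moves."""
    for _ in range(rounds):
        question = random.choice(environment_moves)   # environment moves first
        answer = agent_strategy(question)             # agent responds
        if not wins(question, answer):                # judge the play
            return False
    return True

# The problem "given any n, produce n + 1", read as a game.
wins_successor = lambda n, m: m == n + 1

if __name__ == "__main__":
    # A winning strategy exists (the successor function), so the problem is computable.
    print(play(lambda n: n + 1, wins_successor, environment_moves=range(100)))  # True
    print(play(lambda n: 0, wins_successor, environment_moves=range(100)))      # False: a constant answer cannot win
```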

The challenge of dealing with computability in the real world (computing on continuous data, biological/organic computing, quantum computing, or natural computing in general) has brought a new understanding of computation. Natural computing has different criteria for the success of a computation: halting is not the issue; what matters instead is the adequacy of the computational response within a network of interacting computational processes and devices. In many areas, we have to computationally model emergence that is not clearly algorithmic (Barry Cooper).

New computational paradigms are based on metaphors drawn from natural phenomena and on computer simulations that mimic nature. New questions in focus are:
* What can be learned from natural computation? Is there any non-algorithmic computation?
* Is there a universal model (of which the TM model is a special case) underlying all natural computing? (Wegner and Goldin suggest interactive computing, which uses paraconsistent logic for an open interactive system.)

Biologists, information scientists, cognitive scientists, bioinformaticians, logicians, theoretical physicists and many other researchers are attracted by the possibilities which this new interactive computational paradigm opens.

Info-computational (ICON) Epistemology Naturalized

Naturalized epistemology (Feldman, Kornblith, Stich) is, in general, the idea that knowledge may be studied as a natural phenomenon: the subject matter of epistemology is not our concept of knowledge, but knowledge itself.

“The stimulation of his sensory receptors is all the evidence anybody has had to go on, ultimately, in arriving at his picture of the world. Why not just see how this construction really proceeds? Why not settle for psychology?” (Epistemology Naturalized, Quine 1969)

Naturalist Understanding of Cognition

According to Maturana and Varela (1980), even the simplest organisms possess cognition, and their meaning-production apparatus is contained in their metabolism. There are, of course, also non-metabolic interactions with the environment, such as locomotion, which also generate meaning for an organism by changing its environment and providing new input data.

Maturana's and Varela's understanding of cognition is well suited as the basis for a computationalist account of naturalized evolutionary epistemology. A great conceptual advantage of taking cognition as the central focus of study is that all living organisms possess cognition to some degree.

Universe Computer, Not A Typewriter

What is the mechanism of the evolutionary development of cognitive abilities in organisms?

Critics of the evolutionary approach point to the supposed impossibility of "blind chance" producing such highly complex structures as intelligent living organisms. The proverbial monkeys typing Shakespeare are often used as an illustration.

The Chaitin-Bennett counter-argument: the universe is not a typewriter but a computer, so the monkey types random input into a computer. "Quantum mechanics supplies the universe with “monkeys” in the form of random fluctuations, such as those that seeded the locations of galaxies. The computer into which they type is the universe itself. From a simple initial state, obeying simple physical laws, the universe has systematically processed and amplified the bits of information embodied in those quantum fluctuations. The result of this information processing is the diverse, information-packed universe we see around us: programmed by quanta, physics gave rise first to chemistry and then to life; programmed by mutation and recombination, life gave rise to Shakespeare; programmed by experience and imagination, Shakespeare gave rise to Hamlet. You might say that the difference between a monkey at a typewriter and a monkey at a computer is all the difference in the world." (Lloyd 2006)

The universe computer on which the monkey types is at the same time the hardware and the program, in a way similar to a universal Turing machine, where the program is stored on the same tape as the data it operates on. An example from biological computing is DNA, where the hardware (the molecule) is at the same time the software (the program, the code). In general, each new input restructures the computational universe and changes the preconditions for future inputs. These processes are interactive and self-organizing, and this is what provides the essential speed-up in the process of building more and more complex structures.
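
The point that the hardware is also the software can be sketched with a toy stored-program machine, invented here purely for illustration (it is not a model from the cited literature): instructions and data occupy the same memory, so running code can rewrite its own instructions, loosely analogous to DNA being both molecule and code.

```python
# Toy stored-program ("von Neumann"-style) machine: instructions and data live
# in the same memory, so a running program can rewrite its own instructions.
# The instruction set (0 = halt, 1 = add, 2 = print) is invented for this sketch.

def run(memory, max_steps=100):
    """Execute the program stored in `memory` until it halts (or gives up)."""
    pc = 0
    for _ in range(max_steps):
        op = memory[pc]
        if op == 0:                               # 0: halt
            break
        elif op == 1:                             # 1 a b: mem[a] += mem[b]
            a, b = memory[pc + 1], memory[pc + 2]
            memory[a] += memory[b]
            pc += 3
        elif op == 2:                             # 2 a: print mem[a]
            print(memory[memory[pc + 1]])
            pc += 2
    return memory

if __name__ == "__main__":
    program = [
        1, 6, 12,    # addresses 0-2: add mem[12] into mem[6]; cell 6 is an
                     #   operand of the second print, so this rewrites code
        2, 11,       # addresses 3-4: print mem[11]                    -> 42
        2, 11,       # addresses 5-6: written as "print mem[11]", but cell 6
                     #   was bumped to 12 above, so it prints mem[12]  -> 1
        0,           # address 7: halt
        0, 0, 0,     # addresses 8-10: unused padding
        42,          # address 11: data
        1,           # address 12: data
    ]
    run(program)
```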

References

External links

* [ftp://ftp.idsia.ch/pub/juergen/zuserechnenderraum.pdf Translation of Zuse's book "Rechnender Raum" in English ]
* [ftp://ftp.idsia.ch/pub/juergen/zuse67scan.pdf Scan of Zuse's "Rechnender Raum" paper in German]
* [http://www.idsia.ch/~juergen/digitalphysics.html Zuse's computing universe]
* [http://se10.comlab.ox.ac.uk:8080/InformaticPhenomena/IntroductiontoOASIS_en.html The Oxford Advanced Seminar on Informatic Structures]
* [http://www.amsta.leeds.ac.uk/cie/ Computability in Europe European Research Network]
* [http://www.interdisciplines.org/ Interdisciplines]
* [http://fis.icts.sbg.ac.at/ Foundations of Information Science]
* [http://pcs.essex.ac.uk/ Philosophy of computer science at the University of Essex]
* [http://www.crumpled.com/cp/index.html Computational Philosophy]
* [http://people.pwf.cam.ac.uk/mds26/cogsci/program.html Computation and Cognitive Science, King's College Cambridge]

See also

* Digital physics
* Computing
* Physical information
* Real computation
* Reversible computation
* Theory of computation
* Hypercomputation
* Computational theory of mind

