Singularitarianism
Singularitarianism is a moral philosophy based upon the belief that a technological singularity — the technological creation of smarter-than-human intelligence — is possible, and advocating deliberate action to bring it into effect and ensure its safety. While many futurists and transhumanists speculate on the possibility and nature of this technological development (often referred to as "the Singularity"), Singularitarians believe it is not only possible, but desirable if, and only if, guided safely. Accordingly, they might sometimes "dedicate their lives" to acting in ways they believe will contribute to its safe implementation.

The term "singularitarian" was originally defined by Extropian Mark Plus in 1991 to mean "one who believes the concept of a Singularity". The term has since been redefined to mean "Singularity activist" or "friend of the Singularity"; that is, one who acts so as to bring about the Singularity. [http://www.extropy.org/neologo.htm#s Neologisms of Extropy]

Ray Kurzweil, the author of the book "The Singularity Is Near", defines a Singularitarian as someone "who understands the Singularity and who has reflected on its implications for his or her own life". [The Singularity Is Near, Chapter One (The Six Epochs)]

Beliefs
In his 2000 essay, "Singularitarian Principles", Eliezer Yudkowsky writes that there are four qualities that define a Singularitarian: [http://yudkowsky.net/sing/principles.html Singularitarian Principles]
*A Singularitarian believes that the Singularity is possible and desirable.
*A Singularitarian actually "works" to bring about the Singularity.
*A Singularitarian views the Singularity as an entirely secular, non-mystical process — not the culmination of any form of religious prophecy or destiny.
*A Singularitarian believes the Singularity should benefit the entire world, and should not be a means to benefit any specific individual or group.

In July 2000, Eliezer Yudkowsky, Brian Atkins, and Sabine Atkins founded the Singularity Institute for Artificial Intelligence to work towards the creation of self-improving Friendly AI. The Singularity Institute's writings argue that an AI with the ability to improve upon its own design (Seed AI) would rapidly lead to superintelligence. Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk.

Many believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although exact numbers are hard to quantify, Singularitarianism is presently a small movement. Other prominent Singularitarians include Ray Kurzweil and Nick Bostrom.

See also
*Extropianism
*Seed AI — a theory closely associated with Singularitarianism
*Simulated reality — analysis of a potential technologically based reality

References
External links
* [http://www.singinst.org/why-singularity.html Why Work Towards the Singularity?] by Eliezer Yudkowsky
* [http://www.nickbostrom.com/ethics/ai.html Ethical Issues in Advanced Artificial Intelligence] by Nick Bostrom
Wikimedia Foundation. 2010.