History of computer science
The history of computer science began long before the modern discipline of computer science that emerged in the twentieth century. The progression, from mechanical inventions and mathematical theories towards the modern concepts and machines, formed a major academic field and the basis of a massive worldwide industry.
Early history
Early computation
The earliest known tool for use in computation was the abacus, thought to have been invented in Babylon circa 2400 BCE. It was originally used by drawing lines in sand with pebbles. It was the first known computing device, and the most advanced system of calculation known at the time, preceding Greek methods by 2,000 years. Abaci of a more modern design are still used as calculation tools today.
In 1115 BCE, the South Pointing Chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus, from around the 2nd century BCE, known as the Chinese abacus.
In the 5th century BCE in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3,959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations and recursions with such sophistication that his grammar had computing power equivalent to a Turing machine. Between 200 BCE and 400 CE, Jaina mathematicians in India invented the logarithm. From the 13th century, logarithmic tables were produced by Muslim mathematicians.
The Antikythera mechanism is believed to be the earliest known mechanical analog computer. ["The Antikythera Mechanism Research Project", http://www.antikythera-mechanism.gr/project/general/the-project.html, retrieved 2007-07-01] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BCE.
Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world and were developed by Muslim astronomers, such as the equatorium by Arzachel, [Ahmad Y. Hassan, "Transfer Of Islamic Technology To The West, Part II: Transmission Of Islamic Engineering", http://www.history-science-technology.com/Articles/articles%2071.htm, retrieved 2008-01-22] the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī, ["Islam, Knowledge, and Science", University of Southern California, http://www.usc.edu/dept/MSA/introduction/woi_knowledge.html, retrieved 2008-01-22] and the torquetum by Jabir ibn Aflah. [R. P. Lorch, "The Astronomical Instruments of Jabir ibn Aflah and the Torquetum", Centaurus 20 (1), 1976, pp. 11-34] The first programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers [Teun Koetsier (2001), "On the prehistory of programmable machines: musical automata, looms, calculators", Mechanism and Machine Theory 36, pp. 590-591] and the humanoid robots by Al-Jazari. ["A 13th Century Programmable Robot", University of Sheffield, http://www.shef.ac.uk/marcoms/eview/articles58/robot.html] Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus. [Simon Singh, The Code Book, pp. 14-20] ["Al-Kindi, Cryptography, Codebreaking and Ciphers", http://www.muslimheritage.com/topics/default.cfm?ArticleID=372, retrieved 2007-01-12]
When John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools.
None of the early computational devices were really computers in the modern sense; it took considerable advancement in mathematics and theory before the first modern computers could be designed.
Algorithms
In the 7th century, Indian mathematician Brahmagupta gave the first explanation of the Hindu-Arabic numeral system and the use of zero as both a placeholder and a decimal digit.
Around the year 825, Persian mathematician Al-Khwarizmi wrote a book, On the Calculation with Hindu Numerals, that was principally responsible for the diffusion of the Indian system of numeration in the Middle East and then Europe. Around the 12th century, this book was translated into Latin as Algoritmi de numero Indorum. These books introduced the idea of performing a defined series of steps to accomplish a task, such as the systematic application of arithmetic to algebra. The term algorithm derives from his name.
Binary logic
Around the 3rd century BCE, Indian mathematician Pingala discovered the binary numeral system. In this system, still used by all modern computers, any number can be represented by a sequence of ones and zeros.
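To make the idea concrete, here is a minimal Python sketch (a modern illustration, not a historical method) showing how any non-negative integer can be written as, and recovered from, a sequence of ones and zeros:

```python
def to_binary(n: int) -> str:
    """Represent a non-negative integer as a string of ones and zeros."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the lowest-order bit
        n //= 2
    return "".join(reversed(bits))

def from_binary(bits: str) -> int:
    """Recover the integer from its binary representation."""
    n = 0
    for bit in bits:
        n = n * 2 + int(bit)
    return n

assert to_binary(42) == "101010"
assert from_binary("101010") == 42
```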
In 1703, Gottfried Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent "true" and "false" values or "on" and "off" states. But it took more than a century before George Boole published his Boolean algebra in 1854, a complete system that allowed computational processes to be mathematically modeled.
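As a hedged sketch of what Boole's system makes possible, the Python snippet below models a Boolean expression and enumerates its truth table; the expression itself is chosen for illustration, not taken from Boole's work:

```python
from itertools import product

# Any condition built from "and", "or" and "not" can be modeled
# algebraically. Here: "a and b agree" (logical equivalence).
def agree(a: bool, b: bool) -> bool:
    return (a and b) or (not a and not b)

# Enumerate the truth table over all input combinations.
for a, b in product([False, True], repeat=2):
    print(f"a={a!s:<5} b={b!s:<5} -> {agree(a, b)}")
```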
By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, and this included weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary "one" and an unpunched spot indicated a binary "zero". Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.
The Analytical Engine
It was not until Charles Babbage, considered the "father of computing", that the modern computer began to take shape with his work on the Analytical Engine. Though never successfully built, the device had in its design all the essential functionality of a modern computer. He first described it in 1837, more than 100 years before any similar device was successfully constructed. The difference between Babbage's Engine and preceding devices is simple: he designed his to be programmed.
During their collaboration, mathematician Ada Lovelace published the first ever computer programs in a comprehensive set of notes on the Analytical Engine. Because of this, Lovelace is popularly considered the first computer programmer, but some scholars contend that the programs published under her name were originally created by Babbage.
Birth of computer science
Before the 1920s, "computers" were human clerks who performed computations, usually working under the direction of a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Most of these computers were women, and many of them held degrees in calculus. Some performed astronomical calculations for calendars.
After the 1920s, the expression "computing machine" referred to any machine that performed the work of a human computer, especially one that worked in accordance with the effective methods of the Church–Turing thesis. The thesis states that a mathematical method is effective if it can be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.
Machines that computed with continuous values became known as the "analog" kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or a difference in electrical potential.
Digital machinery, in contrast to analog, represented numeric values in discrete states and stored each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.
The phrase "computing machine" gradually gave away, after the late 1940s, to just "computer" as the onset of electronic digital machinery became common. These computers were able to perform the calculations that were performed by the previous human clerks.
Since the values stored by digital machines were not bound to physical properties the way analog devices were, a logical computer based on digital equipment was able to do anything that could be described as "purely mechanical".
Alan Turing, known as the father of computer science, devised such a logical machine, the Turing machine, whose model later evolved into the modern computer. These new computers were also able to perform non-numeric computations, like music. The study of computability, which began when computational processes were still performed by human clerks, became a science by making explicit what had previously been only implicit in ordinary practice.
Emergence of a discipline
The theoretical groundwork
The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.
1936 was a key year for computer science. Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing.
These topics are covered by what is now called the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.
Turing also included with the thesis a description of the Turing machine. A Turing machine has an infinitely long tape and a read/write head that can move along the tape, changing the values along the way. Clearly such a machine could never be built, but nonetheless the model can simulate the computation of any algorithm that can be performed on a modern computer.
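To make the model concrete, here is a minimal Python simulator for a one-tape Turing machine; the example transition table (a machine that flips every bit on the tape) is an illustrative assumption, not an example from Turing's paper:

```python
def run_turing_machine(transitions, tape, state="start", blank=" "):
    """Simulate a one-tape Turing machine.

    transitions maps (state, symbol) -> (next_state, write_symbol, move),
    where move is -1 (left) or +1 (right). The machine halts when no
    transition is defined for the current (state, symbol) pair.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while (state, cells.get(head, blank)) in transitions:
        state, symbol, move = transitions[(state, cells.get(head, blank))]
        cells[head] = symbol      # write under the head
        head += move              # move the head along the tape
    return "".join(cells[i] for i in sorted(cells)).strip()

# Illustrative machine: walk right, flipping 0s and 1s, halt at a blank.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}
assert run_turing_machine(invert, "10110") == "01001"
```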
Turing is so important to computer science that his name is also featured on the Turing Award and the Turing test. He contributed greatly to British code-breaking successes in the Second World War, and continued to design computers and software through the 1940s, but committed suicide in 1954.
At a symposium on large-scale digital machinery in Cambridge, Turing said, "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus".
In 1948, the first practical computer that could run stored programs, based on the Turing machine model, was built: the Manchester Baby.
In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy.
Shannon and information theory
Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon's publication of his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits. While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, using the properties of electrical switches to do logic, is the basic idea that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
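Shannon's mapping can be sketched as follows (standard relay logic, assumed here for illustration rather than drawn from his thesis): relays wired in series behave like Boolean AND, relays wired in parallel like Boolean OR, so any Boolean expression corresponds to a switching circuit.

```python
# Switches in series conduct only if every switch is closed: Boolean AND.
def series(*switches: bool) -> bool:
    return all(switches)

# Switches in parallel conduct if any switch is closed: Boolean OR.
def parallel(*switches: bool) -> bool:
    return any(switches)

# A circuit computing "(a AND b) OR ((NOT a) AND c)", built purely
# from the conduct/do-not-conduct behavior of switches.
def circuit(a: bool, b: bool, c: bool) -> bool:
    return parallel(series(a, b), series(not a, c))

assert circuit(True, True, False) is True
assert circuit(False, True, False) is False
assert circuit(False, False, True) is True
```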
Shannon went on to found the field of information theory with his 1948 paper A Mathematical Theory of Communication, which applied probability theory to the problem of how best to encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
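At the core of that paper is the entropy of a message source, H = -Σ p_i log2 p_i, which bounds how compactly its output can be encoded. A minimal Python sketch, with an example distribution assumed for illustration:

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: the average information per symbol."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A biased three-symbol source: "a" 50%, "b" 25%, "c" 25%.
# Its entropy is 1.5 bits, so no lossless code can average fewer than
# 1.5 bits per symbol, even though a naive fixed-length code uses 2.
print(entropy([0.5, 0.25, 0.25]))  # 1.5
```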
Wiener and Cybernetics
From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman". He published Cybernetics in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery and memory devices to cognitive processes in his analysis of brain waves.
The first computer bug
:"Main article:
Software bug "The first actual computer bug was a moth. It was stuck in between the relays on the Harvard Mark II. [http://www.history.navy.mil/photos/images/h96000/h96566kc.htm] While the invention of the term 'bug' is often but erroneously attributed to
Grace Hopper , a rear admiral in the U.S. Navy, who supposedly logged the "bug" onSeptember 9 1945 , most other accounts conflict at least with these details. According to these accounts, the actual date wasSeptember 9 1947 when operators filed this 'incident' — along with the insect and the notation "First actual case of bug being found" (seesoftware bug for details).ee also
* Computer science
* History of computing
* History of computing hardware
* Timeline of algorithms
* List of prominent pioneers in computer science
* List of computer term etymologies, the origins of computer science words
References
* [http://www.cs.uwaterloo.ca/~shallit/Courses/134/history.html A Very Brief History of Computer Science]
* [http://www.computerhistory.org/ Computer History Museum]
* [http://www.eingang.org/Lecture/ Computers: From the Past to the Present]
* [http://www.history.navy.mil/photos/images/h96000/h96566kc.htm The First "Computer Bug"] at the Online Library of the Naval Historical Center, retrieved February 28, 2006
* [http://www.bitsavers.org/ Bitsavers], an effort to capture, salvage, and archive historical computer software and manuals from minicomputers and mainframes of the 50s, 60s, 70s, and 80s
External links
* [http://www.cbi.umn.edu/oh/display.phtml?id=5 Oral history interview with William F. Miller] at Charles Babbage Institute, University of Minnesota, Minneapolis. Miller contrasts the emergence of computer science at Stanford with developments at Harvard and the University of Pennsylvania.
* [http://www.cbi.umn.edu/oh/display.phtml?id=274 Oral history interview with Alexandra Forsythe] at Charles Babbage Institute, University of Minnesota, Minneapolis. Forsythe discusses the career of her husband, George Forsythe, who established Stanford University's program in computer science.
* [http://www.cbi.umn.edu/oh/display.phtml?id=146 Oral history interview with Allen Newell] at Charles Babbage Institute, University of Minnesota, Minneapolis. Newell discusses his entry into computer science, funding for computer science departments and research, the development of the Computer Science Department at Carnegie Mellon University, including the work of Alan J. Perlis and Raj Reddy, and the growth of the computer science and artificial intelligence research communities. Compares computer science programs at Stanford, MIT, and Carnegie Mellon.
* [http://www.cbi.umn.edu/oh/display.phtml?id=271 Oral history interview with Louis Fein] at Charles Babbage Institute, University of Minnesota, Minneapolis. Fein discusses establishing computer science as an academic discipline at Stanford Research Institute (SRI) as well as contacts with the University of California--Berkeley, the University of North Carolina, Purdue, International Federation for Information Processing and other institutions.
* [http://www.cbi.umn.edu/oh/display.phtml?id=132 Oral history interview with W. Richards Adrion] at Charles Babbage Institute, University of Minnesota, Minneapolis. Adrion gives a brief history of theoretical computer science in the United States and NSF's role in funding that area during the 1970s and 1980s.
* [http://www.cbi.umn.edu/oh/display.phtml?id=153 Oral history interview with Bernard A. Galler] at Charles Babbage Institute, University of Minnesota, Minneapolis. Galler describes the development of computer science at the University of Michigan from the 1950s through the 1980s and discusses his own work in computer science.
* [http://plato.stanford.edu/entries/computing-history/ The Modern History of Computing] by B. Jack Copeland, in the Stanford Encyclopedia of Philosophy