History of computer science

The history of computer science began long before the modern discipline of computer science that emerged in the twentieth century. The progression, from mechanical inventions and mathematical theories towards the modern concepts and machines, formed a major academic field and the basis of a massive world-wide industry.

Early history

Early computation

The earliest known tool for use in computation was the abacus, thought to have been invented in Babylon circa 2400 BCE. It was originally used by drawing lines in sand and placing pebbles along them. This was the earliest known aid to calculation and the most advanced system of its time, preceding Greek methods by some 2,000 years. Abaci of a more modern design are still used as calculation tools today.

In 1115 BCE, the South Pointing Chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus, known as the Chinese abacus, around the 2nd century BCE.

In the 5th century BCE in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3,959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations and recursion with such sophistication that his grammar has been described as having computing power equivalent to a Turing machine. Between 200 BCE and 400 CE, Jaina mathematicians in India invented the logarithm. From the 13th century, logarithmic tables were produced by Muslim mathematicians.

The Antikythera mechanism is believed to be the earliest known mechanical analog computer. [The Antikythera Mechanism Research Project, http://www.antikythera-mechanism.gr/project/general/the-project.html, retrieved 2007-07-01] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BCE.

Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world, developed by Muslim astronomers: the equatorium by Arzachel, [Ahmad Y. Hassan, "Transfer of Islamic Technology to the West, Part II: Transmission of Islamic Engineering", http://www.history-science-technology.com/Articles/articles%2071.htm, retrieved 2008-01-22] the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī, ["Islam, Knowledge, and Science", University of Southern California, http://www.usc.edu/dept/MSA/introduction/woi_knowledge.html, retrieved 2008-01-22] and the torquetum by Jabir ibn Aflah. [R. P. Lorch, "The Astronomical Instruments of Jabir ibn Aflah and the Torquetum", Centaurus 20 (1), 1976, pp. 11-34] The first programmable machines were also invented by Muslim engineers, such as the automatic flute player of the Banū Mūsā brothers [Teun Koetsier, "On the prehistory of programmable machines: musical automata, looms, calculators", Mechanism and Machine Theory 36 (2001), pp. 590-591] and the humanoid robots of Al-Jazari. ["A 13th Century Programmable Robot", University of Sheffield, http://www.shef.ac.uk/marcoms/eview/articles58/robot.html] Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus. [Simon Singh, "The Code Book", pp. 14-20] ["Al-Kindi, Cryptography, Codebreaking and Ciphers", http://www.muslimheritage.com/topics/default.cfm?ArticleID=372, retrieved 2007-01-12]

When John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools.

None of the early computational devices were really computers in the modern sense, and it took considerable advancement in mathematics and theory before the first modern computers could be designed.

Algorithms

In the 7th century, Indian mathematician Brahmagupta gave the first explanation of the Hindu-Arabic numeral system and the use of zero as both a placeholder and a decimal digit.

Around the year 825, the Persian mathematician Al-Khwarizmi wrote a book, "On the Calculation with Hindu Numerals", that was principally responsible for the diffusion of the Indian system of numeration in the Middle East and then Europe. Around the 12th century, the book was translated into Latin as "Algoritmi de numero Indorum". These texts presented new methods for carrying out a series of steps in order to accomplish a task, such as the systematic application of arithmetic to algebra. The term algorithm derives from his name.

Binary logic

Around the 3rd century BCE, the Indian mathematician Pingala described the binary numeral system. In this system, still used in all modern computers, any number can be represented by a sequence of ones and zeros.
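To make the idea concrete, here is a minimal sketch (in Python, purely illustrative and not part of the historical account) of how a decimal number is rewritten as a sequence of ones and zeros by repeated division by two:

    # Convert a non-negative integer to its binary representation
    # by repeatedly dividing by two and collecting the remainders.
    def to_binary(n):
        if n == 0:
            return "0"
        bits = []
        while n > 0:
            bits.append(str(n % 2))  # remainder is the next bit, least significant first
            n //= 2
        return "".join(reversed(bits))

    print(to_binary(13))  # prints "1101", since 8 + 4 + 0 + 1 = 13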

In 1703, Gottfried Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent "true" and "false" values or "on" and "off" states. But it took more than a century before George Boole published his Boolean algebra in 1854 with a complete system that allowed computational processes to be mathematically modeled.

By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, and this included weaving. Punch cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary "one" and an unpunched spot indicated a binary "zero". Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.

The Analytical Engine

It was not until Charles Babbage, considered the "father of computing," that the modern computer began to take shape with his work on the Analytical Engine. The device, though never successfully built, had all of the functionality of a modern computer in its design. He first described it in 1837, more than 100 years before any similar device was successfully constructed. The difference between Babbage's Engine and preceding devices is simple: he designed his to be "programmed".

During her collaboration with Babbage, the mathematician Ada Lovelace published the first ever computer programs in a comprehensive set of notes on the Analytical Engine. Because of this, Lovelace is popularly considered the first computer programmer, though some scholars contend that the programs published under her name were originally created by Babbage.

Birth of computer science

Before the 1920s, "computers" were human clerks who performed computations, usually working under the direction of a physicist. Many thousands of these computers were employed in commerce, government, and research establishments. Most of them were women, and many held degrees in calculus. Some performed astronomical calculations for calendars.

After the 1920s, the expression "computing machine" referred to any machine that performed the work of a human computer, especially machines that worked in accordance with the effective methods of the Church-Turing thesis. The thesis states that a mathematical method is effective if it can be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.

Machines that computed with continuous values became known as the "analog" kind. They used machinery that represented continuous numeric quantities, such as the angle of a shaft's rotation or a difference in electrical potential.

Digital machinery, in contrast to analog, represented numeric values as discrete states and stored each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.

The phrase "computing machine" gradually gave away, after the late 1940s, to just "computer" as the onset of electronic digital machinery became common. These computers were able to perform the calculations that were performed by the previous human clerks.

Since the values stored by digital machines were not bound to physical properties as in analog devices, a logical computer based on digital equipment was able to do anything that could be described as "purely mechanical." Alan Turing, known as the father of computer science, devised such a logical computer, the Turing machine, a model that later evolved into the modern computer. These new computers were also able to perform non-numeric computations, such as processing music.

As computational processes passed from human clerks to machines, the study of computability began to develop into a science in its own right, making explicit what had previously been only implicit in ordinary practice.

Emergence of a discipline

The theoretical groundwork

The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.

1936 was a key year for computer science. Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a "purely mechanical" model for computing.

These topics are covered by what is now called the Church–Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.

Turing also included with the thesis a description of the Turing machine. A Turing machine has an infinitely long tape and a read/write head that can move along the tape, changing the values along the way. Clearly such a machine could never be built, but nonetheless, the model can simulate the computation of any algorithm which can be performed on a modern computer.
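A minimal sketch in Python (an illustrative toy, not Turing's original formalism) shows how such a machine can be simulated: a transition table maps the current state and tape symbol to a new symbol, a head movement, and a next state. The example program simply inverts every bit on its tape and then halts:

    # A toy Turing machine simulator: transitions map (state, symbol)
    # to (new symbol, head move, next state).
    def run(tape, transitions, state="start", halt="halt", blank="_"):
        cells = dict(enumerate(tape))  # sparse tape, indexed by position
        head = 0
        while state != halt:
            symbol = cells.get(head, blank)
            new_symbol, move, state = transitions[(state, symbol)]
            cells[head] = new_symbol
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells))

    invert_bits = {
        ("start", "0"): ("1", "R", "start"),  # flip 0 to 1 and move right
        ("start", "1"): ("0", "R", "start"),  # flip 1 to 0 and move right
        ("start", "_"): ("_", "R", "halt"),   # blank cell: stop
    }

    print(run("1011", invert_bits))  # prints "0100_"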

Turing is so important to computer science that his name is also featured on the Turing Award and the Turing test. He contributed greatly to British code-breaking successes in the Second World War, and continued to design computers and software through the 1940s, but committed suicide in 1954.

At a symposium on large-scale digital machinery in Cambridge, Turing said, "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus".

In 1948, the first practical computer that could run stored programs, based on the Turing machine model, was built: the Manchester Baby.

In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy.

Shannon and information theory

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an "ad hoc" manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon's publication of his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits. While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.
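The core observation can be sketched in a few lines of Python (an illustration of the idea, not Shannon's own notation): switches wired in series pass current only when both are closed, which behaves like Boolean AND, while switches wired in parallel behave like OR:

    # Relay circuits as Boolean algebra: True means a closed switch, False an open one.
    def series(a, b):
        return a and b    # current flows only if both switches are closed (AND)

    def parallel(a, b):
        return a or b     # current flows if either switch is closed (OR)

    # A small circuit: switch A in series with B, that pair in parallel with C.
    def circuit(a, b, c):
        return parallel(series(a, b), c)

    print(circuit(True, False, True))   # True: the branch through C carries current
    print(circuit(True, False, False))  # False: no closed path exists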

Shannon went on to found the field of information theory with his 1948 paper entitled A Mathematical Theory of Communication, which applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
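As a brief illustration using the now-standard formulation (a sketch rather than the notation of Shannon's paper), the entropy of a source, H = -Σ p_i log2 p_i, gives the average number of bits needed per symbol and thus a lower bound on lossless compression:

    import math

    # Shannon entropy in bits per symbol for a probability distribution.
    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin flip is maximally unpredictable
    print(entropy([0.9, 0.1]))   # about 0.47 bits: a biased source can be compressed further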

Wiener and Cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman." He published "Cybernetics" in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery, and memory devices with the workings of the brain, drawing on his analysis of brain waves.

The first computer bug

:"Main article: Software bug"

The first actual computer bug was a moth, found stuck between the relays of the Harvard Mark II. [http://www.history.navy.mil/photos/images/h96000/h96566kc.htm] The invention of the term "bug" is often, but erroneously, attributed to Grace Hopper, a rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945; most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed the incident, along with the insect and the notation "First actual case of bug being found" (see software bug for details).

See also

* Computer science
* History of computing
* History of computing hardware
* Timeline of algorithms
* List of prominent pioneers in computer science
* List of computer term etymologies, the origins of computer science words

Notes

References

* [http://www.cs.uwaterloo.ca/~shallit/Courses/134/history.html A Very Brief History of Computer Science]
* [http://www.computerhistory.org/ Computer History Museum]
* [http://www.eingang.org/Lecture/ Computers: From the Past to the Present]
* [http://www.history.navy.mil/photos/images/h96000/h96566kc.htm The First "Computer Bug"] at the Online Library of the Naval Historical Center, retrieved February 28, 2006
* [http://www.bitsavers.org/ Bitsavers], an effort to capture, salvage, and archive historical computer software and manuals from minicomputers and mainframes of the 1950s, 1960s, 1970s, and 1980s

External links

* [http://www.cbi.umn.edu/oh/display.phtml?id=5 Oral history interview with William F. Miller] at Charles Babbage Institute, University of Minnesota, Minneapolis. Miller contrasts the emergence of computer science at Stanford with developments at Harvard and the University of Pennsylvania.
* [http://www.cbi.umn.edu/oh/display.phtml?id=274 Oral history interview with Alexandra Forsythe] at Charles Babbage Institute, University of Minnesota, Minneapolis. Forsythe discusses the career of her husband, George Forsythe, who established Stanford University's program in computer science.
* [http://www.cbi.umn.edu/oh/display.phtml?id=146 Oral history interview with Allen Newell] at Charles Babbage Institute, University of Minnesota, Minneapolis. Newell discusses his entry into computer science, funding for computer science departments and research, the development of the Computer Science Department at Carnegie Mellon University, including the work of Alan J. Perlis and Raj Reddy, and the growth of the computer science and artificial intelligence research communities. Compares computer science programs at Stanford, MIT, and Carnegie Mellon.
* [http://www.cbi.umn.edu/oh/display.phtml?id=271 Oral history interview with Louis Fein] at Charles Babbage Institute, University of Minnesota, Minneapolis. Fein discusses establishing computer science as an academic discipline at Stanford Research Institute (SRI) as well as contacts with the University of California, Berkeley, the University of North Carolina, Purdue, the International Federation for Information Processing and other institutions.
* [http://www.cbi.umn.edu/oh/display.phtml?id=132 Oral history interview with W. Richards Adrion] at Charles Babbage Institute, University of Minnesota, Minneapolis. Adrion gives a brief history of theoretical computer science in the United States and NSF's role in funding that area during the 1970s and 1980s.
* [http://www.cbi.umn.edu/oh/display.phtml?id=153 Oral history interview with Bernard A. Galler] at Charles Babbage Institute, University of Minnesota, Minneapolis. Galler describes the development of computer science at the University of Michigan from the 1950s through the 1980s and discusses his own work in computer science.
* [http://plato.stanford.edu/entries/computing-history/ The Modern History of Computing] by B. Jack Copeland, in the Stanford Encyclopedia of Philosophy

