History of information theory

The decisive event which established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the "Bell System Technical Journal" in July and October of 1948.

In this revolutionary and groundbreaking paper, work which Shannon had substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that "The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."

With it came the ideas of
* the information entropy and redundancy of a source, and its relevance through the source coding theorem;
* the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
* the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel; and of course
* the bit, a new way of seeing the most fundamental unit of information (a brief numerical sketch of the entropy and channel-capacity formulas follows this list).
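To make two of these quantities concrete, the following Python sketch computes the entropy of a small source and the Shannon–Hartley capacity of a Gaussian channel; the symbol probabilities, bandwidth, and signal-to-noise ratio are illustrative values chosen for this example, not figures taken from Shannon's paper.

    import math

    # Hypothetical four-symbol source with unequal probabilities (example values only).
    probs = [0.5, 0.25, 0.125, 0.125]

    # Shannon entropy in bits per symbol: H = -sum p * log2(p)
    H = -sum(p * math.log2(p) for p in probs)
    print(f"source entropy: {H} bits per symbol")          # 1.75

    # Shannon-Hartley capacity of a Gaussian channel: C = B * log2(1 + S/N)
    bandwidth_hz = 3000.0      # assumed telephone-like bandwidth
    snr_linear = 1000.0        # assumed signal-to-noise ratio (30 dB)
    C = bandwidth_hz * math.log2(1 + snr_linear)
    print(f"channel capacity: {C:.0f} bits per second")    # roughly 29,900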

Before 1948

Early telecommunications

Some of the oldest methods of instant telecommunications implicitly use many of the ideas that would later be quantified in information theory. Modern telegraphy, starting in the 1830s, used Morse code, in which more common letters (like "E," which is expressed as one "dot") are transmitted more quickly than less common letters (like "J," which is expressed by one "dot" followed by three "dashes"). The idea of encoding information in this manner is the cornerstone of lossless data compression. A hundred years later, frequency modulation illustrated that bandwidth can be considered merely another degree of freedom. The vocoder, now largely regarded as an audio engineering curiosity, was originally designed in 1939 to use less bandwidth than that of the original message, in much the same way that mobile phones now trade off voice quality against bandwidth.
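As a rough sketch of the saving that Morse-style variable-length coding provides, the snippet below compares the average transmission time of a frequency-matched code against one in which every letter takes as long as the rarest; the letter frequencies are made-up round numbers, and the timing model (dot = 1 unit, dash = 3 units, gaps ignored) is a simplification.

    # Rough illustration of Morse-style variable-length coding (assumed frequencies).
    morse = {"E": ".", "T": "-", "A": ".-", "J": ".---"}
    freq  = {"E": 0.60, "T": 0.25, "A": 0.10, "J": 0.05}   # hypothetical frequencies

    def duration(code):
        # dot = 1 time unit, dash = 3 time units; inter-element gaps ignored
        return sum(1 if c == "." else 3 for c in code)

    avg_matched = sum(freq[s] * duration(morse[s]) for s in morse)
    avg_fixed   = duration(".---")          # every letter as costly as "J"

    print(f"frequency-matched code: {avg_matched:.2f} time units per letter")
    print(f"fixed-length code:      {avg_fixed} time units per letter")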

Quantitative ideas of information

The most direct antecedents of Shannon's work were two papers published in the 1920s by Harry Nyquist and Ralph Hartley, who were both still research leaders at Bell Labs when Shannon arrived in the early 1940s.

Nyquist's 1924 paper, "Certain Factors Affecting Telegraph Speed", is mostly concerned with some detailed engineering aspects of telegraph signals. But a more theoretical section discusses quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation

    W = K log m

where "W" is the speed of transmission of intelligence, "m" is the number of different voltage levels to choose from at each time step, and "K" is a constant.

Hartley's 1928 paper, called simply "Transmission of Information", went further by using the word "information" (in a technical sense), and making it explicitly clear that information in this context was a measurable quantity, reflecting only the receiver's ability to distinguish that one sequence of symbols had been intended by the sender rather than any other, quite regardless of any associated meaning or other psychological or semantic aspect the symbols might represent. This amount of information he quantified as

    H = log S^n

where "S" was the number of possible symbols, and "n" the number of symbols in a transmission. The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit or scale or measure of information. The Hartley information, "H"0, is still used as a quantity for the logarithm of the total number of possibilities.

A similar unit based on the base-10 logarithm of probability, the "ban", and its derived unit the deciban (one tenth of a ban), were introduced by Alan Turing in 1940 as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers. The "decibannage" represented the reduction in (the logarithm of) the total number of possibilities (similar to the change in the Hartley information); and also the log-likelihood ratio (or change in the weight of evidence) that could be inferred for one hypothesis over another from a set of observations. The expected change in the weight of evidence is equivalent to what was later called the Kullback discrimination information.
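A minimal sketch of the weight-of-evidence calculation in decibans follows, assuming some made-up observation probabilities rather than anything from the actual Enigma work:

    import math

    # Hypothetical probabilities of one observation under two competing hypotheses.
    p_obs_given_h1 = 0.30
    p_obs_given_h2 = 0.05

    # Weight of evidence for H1 over H2, as a base-10 log-likelihood ratio.
    bans = math.log10(p_obs_given_h1 / p_obs_given_h2)
    print(f"{bans:.3f} bans = {10 * bans:.1f} decibans in favour of H1")

    # Evidence from independent observations simply adds on this log scale,
    # which is what made the deciban convenient for tallying by hand.
    second = 10 * math.log10(0.20 / 0.10)              # a second hypothetical observation
    print(f"running total: {10 * bans + second:.1f} decibans")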

But underlying these notions was still the idea of equal a priori probabilities: the information content of events of unequal probability had not yet been addressed, nor had any underlying picture been formed of the questions raised by the communication of such varied outcomes.

Entropy in statistical mechanics

One area where unequal probabilities were indeed well known was statistical mechanics, where Ludwig Boltzmann had, in the context of his H-theorem of 1872, first introduced the quantity

    H = -Σ f_i log f_i

as a measure of the breadth of the spread of states available to a single particle in a gas of like particles, where "f"_i represented the relative frequency of each possible state. Boltzmann argued mathematically that the effect of collisions between the particles would cause the "H"-function to inevitably increase from any initial configuration until equilibrium was reached; and he further identified it as an underlying microscopic rationale for the macroscopic thermodynamic entropy of Clausius.

(The "H"-theorem of Boltzmann subsequently led to no end of controversy; and can still cause lively debates to the present day, often aggravated by protagonists not realising that they are arguing at cross-purposes. The theorem relies on a hidden assumption, to wit, that useful information is destroyed by the collisions, which can be questioned. Also, it relies on a non-equilibrium state being singled out as the initial state (not the final state), which breaks time symmetry. Also, strictly, it applies only in a statistical sense, namely that an average "H"-function would be non-decreasing).

Boltzmann's definition was soon reworked by the American mathematical physicist J. Willard Gibbs into a general formula for statistical-mechanical entropy, no longer requiring identical and non-interacting particles, but instead based on the probability distribution p_i for the complete microstate i of the total system:

    S = -k_B Σ p_i ln p_i

This (Gibbs) entropy, from statistical mechanics, can be shown to correspond directly to Clausius's classical thermodynamic definition.
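For a tiny model system the Gibbs formula can be evaluated directly; the three microstate probabilities below are made-up values, and the final lines simply note that dropping k_B and switching to base-2 logarithms yields the same functional form as Shannon's measure.

    import math

    k_B = 1.380649e-23     # Boltzmann constant, in joules per kelvin

    # Hypothetical microstate probabilities for a three-state model system.
    p = [0.6, 0.3, 0.1]

    # Gibbs entropy: S = -k_B * sum p_i ln p_i
    S = -k_B * sum(pi * math.log(pi) for pi in p)
    print(f"S = {S:.3e} J/K")

    # The same distribution measured in bits (Shannon's entropy):
    H_bits = -sum(pi * math.log2(pi) for pi in p)
    print(f"H = {H_bits:.3f} bits")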

Shannon himself was apparently not particularly aware of the close similarity between his new measure and earlier work in thermodynamics, but John von Neumann was. It is said that, when Shannon was deciding what to call his new measure and fearing the term 'information' was already over-used, von Neumann told him firmly: "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage."

(Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored further in the article "Entropy in thermodynamics and information theory").

Development since 1948

The publication of Shannon's 1948 paper, "A Mathematical Theory of Communication", in the "Bell System Technical Journal" marked the founding of information theory as we know it today. Many developments and applications of the theory have taken place since then, making possible many modern devices for data communication and storage, such as CD-ROMs and mobile phones. Notable developments are listed in the timeline of information theory.

See also

* Timeline of information theory
* Shannon, C.E.
* Hartley, R.V.L.

