- Ban (information)
A ban, sometimes called a hartley (symbol Hart) or a dit (short for decimal digit), is a logarithmic unit that measures information or entropy, based on base-10 logarithms and powers of 10, rather than the base-2 logarithms and powers of 2 that define the bit. Just as a bit corresponds to a binary digit, a ban corresponds to a decimal digit. A deciban is one tenth of a ban, analogous to a decibel.
One ban corresponds to about 3.32 bits (log2(10)), or 2.30 nats (ln(10)). A deciban is about 0.33 bits.
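These conversion factors follow directly from the change-of-base rule for logarithms, which can be checked with a short calculation (a sketch; the function names are illustrative, not standard):

```python
import math

def bans_to_bits(bans):
    # 1 ban = log2(10) bits, since a base-10 digit carries log2(10) binary digits of information
    return bans * math.log2(10)

def bans_to_nats(bans):
    # 1 ban = ln(10) nats (natural-logarithm units)
    return bans * math.log(10)

print(round(bans_to_bits(1), 2))    # one ban in bits: 3.32
print(round(bans_to_nats(1), 2))    # one ban in nats: 2.3
print(round(bans_to_bits(0.1), 2))  # one deciban in bits: 0.33
```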
History
The ban and the deciban were invented by Alan Turing with I. J. Good in 1940, to measure the amount of information that could be deduced by the codebreakers at Bletchley Park using the Banburismus procedure, towards determining each day's unknown setting of the German naval Enigma cipher machine. The name was inspired by the enormous sheets of card, printed in the town of Banbury about 30 miles away, that were used in the process.[1]
Jack Good argued that the sequential summation of decibans to build up a measure of the weight of evidence in favour of a hypothesis is essentially Bayesian inference.[1] Donald A. Gillies, however, argued that the ban is, in effect, the same as Karl Popper's measure of the severity of a test.[2]
The term hartley is named after Ralph Hartley, who suggested the unit in 1928.[3][4]
The units pre-date Shannon's bit by at least eight years.
Usage as a unit of probability
The deciban is a particularly useful measure of information in odds ratios or weights of evidence. Ten decibans correspond to odds of 10:1; twenty decibans to odds of 100:1; and so on. According to I. J. Good, a change in a weight of evidence of 1 deciban (i.e., a change in the odds from evens to about 5:4), or perhaps half a deciban, is about as fine a distinction as humans can reasonably be expected to make in quantifying their degree of belief in a hypothesis.[1]
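The relation between decibans and odds is just a scaled base-10 logarithm. A minimal sketch of the two directions (function names are illustrative):

```python
import math

def odds_to_decibans(odds_ratio):
    # weight of evidence in decibans = 10 * log10(odds ratio)
    return 10 * math.log10(odds_ratio)

def decibans_to_odds(decibans):
    # inverse: odds ratio = 10 ** (decibans / 10)
    return 10 ** (decibans / 10)

print(odds_to_decibans(10))           # 10:1 odds  -> 10.0 decibans
print(odds_to_decibans(100))          # 100:1 odds -> 20.0 decibans
print(round(decibans_to_odds(1), 2))  # 1 deciban  -> about 1.26, roughly 5:4
```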
References
- ^ a b c Good, I.J. (1979). "Studies in the History of Probability and Statistics. XXXVII A. M. Turing's statistical work in World War II". Biometrika 66 (2): 393–396. doi:10.1093/biomet/66.2.393. MR0548210.
- ^ Gillies, Donald A. (1990). "The Turing-Good Weight of Evidence Function and Popper's Measure of the Severity of a Test". British Journal for the Philosophy of Science 41 (1): 143–146. doi:10.1093/bjps/41.1.143. JSTOR 688010. MR055678.
- ^ Hartley, R.V.L. (July 1928). "Transmission of Information". Bell System Technical Journal VII (3): 535–563. http://dotrose.com/etext/90_Miscellaneous/transmission_of_information_1928b.pdf. Retrieved 2008-03-27.
- ^ Reza, Fazlollah M. An Introduction to Information Theory. New York: Dover, 1994. ISBN 0-486-68210-2
Further reading
- David J. C. MacKay. Information Theory, Inference, and Learning Algorithms Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1. This on-line textbook includes a chapter on the units of information content, and the game of Banburismus that the codebreakers played when cracking each day's Enigma codes.