Nat (information)
A nat (sometimes also nit or nepit) is a logarithmic unit of information or entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms which define the bit. The nat is the natural unit for information entropy. Physical systems of natural units which normalize Boltzmann's constant to 1 are effectively measuring thermodynamic entropy in nats.
When the Shannon entropy is written using a natural logarithm, its value is implicitly measured in nats. One nat is equal to 1/ln 2 ≈ 1.44 bits, or 1/ln 10 ≈ 0.434 bans.
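As a minimal sketch of the relationship above: computing the Shannon entropy with the natural logarithm gives a value in nats, and dividing by ln 2 converts it to bits. (The function names here are illustrative, not from any particular library.)

```python
import math

def entropy_nats(probs):
    """Shannon entropy using the natural logarithm, measured in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def nats_to_bits(nats):
    """Convert nats to bits by dividing by ln 2 (1 nat ~ 1.44 bits)."""
    return nats / math.log(2)

# A fair coin toss has entropy ln 2 ~ 0.693 nats, which is exactly 1 bit.
h = entropy_nats([0.5, 0.5])
```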
History
Alan Turing used the natural ban (Hodges 1983, Alan Turing: The Enigma). Boulton and Wallace (1970) used the term nit in conjunction with minimum message length, which was subsequently changed by the minimum description length community to nat to avoid confusion with the nit used as a unit of luminance (Comley and Dowe, 2005, sec. 11.4.1, p. 271).
References
- Comley, J. W. & Dowe, D. L. (2005). "Minimum Message Length, MDL and Generalised Bayesian Networks with Asymmetric Languages". In Grünwald, P.; Myung, I. J. & Pitt, M. A. Advances in Minimum Description Length: Theory and Applications. Cambridge: MIT Press. ISBN 0262072629. http://www.csse.monash.edu.au/~dld/David.Dowe.publications.html#ComleyDowe2005.
- Reza, Fazlollah M. (1994). An Introduction to Information Theory. New York: Dover. ISBN 0486682102.