Entropy (general concept)

In many branches of science, entropy is a measure of the disorder of a system. Entropy is particularly notable because it has a broad, common definition shared across physics, mathematics, and information science. Although the general notion is shared, its precise definition must be adapted to each particular field of study. No matter what the field, however, the definition of the entropy "S" of a system always takes the basic form:

S = -k Σ_i P_i ln P_i,

where P_i is a probability with 0 ≤ P_i ≤ 1, and "k" is a constant. The construction and interpretation of the probabilities vary from field to field. A list of these adaptations is presented below:
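The basic form above can be sketched numerically. The snippet below is a minimal illustration (the function name `entropy` and the convention that zero-probability terms contribute nothing are choices made here, not part of the formula's statement); with k = 1 it gives the Shannon entropy in nats, while choosing k as Boltzmann's constant yields the Gibbs entropy of statistical thermodynamics.

```python
import math

def entropy(probs, k=1.0):
    """Compute S = -k * sum_i P_i * ln(P_i) for a probability distribution.

    Terms with P_i == 0 are skipped, following the standard convention
    that p * ln(p) -> 0 as p -> 0.
    """
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes maximizes entropy: S = ln 4.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # ≈ 1.3863 (= ln 4)

# A certain outcome (P = 1) has zero entropy: no disorder at all.
print(entropy([1.0]))  # 0.0
```

Note that the same function serves every field listed below; only the origin of the probabilities P_i and the choice of the constant k change.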

In thermodynamics:
* Entropy (classical thermodynamics), the macroscopic approach to thermodynamic entropy
* Entropy (statistical thermodynamics), the microscopic approach to thermodynamic entropy
* Gibbs entropy, statistical entropy of a thermodynamic system
** Boltzmann entropy, an approximation to Gibbs entropy
** Tsallis entropy, a generalization of Boltzmann-Gibbs entropy
* von Neumann entropy, entropy of a quantum-mechanical system

In information and computer science:
* Information entropy (Shannon entropy), a measure of the amount of information contained in a message
* Entropy encoding, data compression strategies
* Rényi entropy, a family of diversity measures used to define fractal dimensions
* Entropy (computing), an indicator of the number of random bits available to seed cryptography systems

In mathematics:
* Kolmogorov–Sinai entropy, the rate of information generation by a measure-preserving dynamical system
* Topological entropy, a measure of exponential growth in the number of distinguishable orbits of a dynamical system

In biology:
* Entropy (ecology), a measure of biodiversity


Wikimedia Foundation. 2010.
