Gibbs entropy

In thermodynamics, specifically in statistical mechanics, the Gibbs entropy formula is the standard formula for calculating the statistical mechanical entropy of a thermodynamic system,

: S = -k_\text{B} \sum_i p_i \ln p_i \qquad (1)

where k_B is the Boltzmann constant, p_i is the probability of the i-th state, and the summation runs over the possible states of the system as a whole (typically points in a 6N-dimensional phase space, if the system contains N separate particles). The entropy will be overestimated if correlations, and more generally any statistical dependence between the state probabilities, are ignored. Such correlations arise in systems of interacting particles, that is, in all systems more complex than an ideal gas.
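As an illustrative sketch (not part of the original article), equation (1) can be evaluated numerically for a discrete probability distribution; the function name `gibbs_entropy` is a label chosen here, not a standard library call:

```python
import math

def gibbs_entropy(probabilities, k=1.380649e-23):
    """Gibbs entropy S = -k_B * sum_i p_i ln p_i, in J/K by default.

    probabilities -- list of state probabilities p_i, summing to 1.
    """
    if not math.isclose(sum(probabilities), 1.0):
        raise ValueError("probabilities must sum to 1")
    # Terms with p_i == 0 contribute nothing, since p ln p -> 0 as p -> 0.
    return -k * sum(p * math.log(p) for p in probabilities if p > 0)

# Two equally likely states give S = k_B ln 2:
print(gibbs_entropy([0.5, 0.5]))   # ≈ 9.57e-24 J/K
# A system certainly in one state has zero entropy:
print(gibbs_entropy([1.0, 0.0]))   # 0.0
```

Passing `k=1` (or `k=1/math.log(2)`) instead of the Boltzmann constant recovers the dimensionless Shannon entropy in nats (or bits), which is the equivalence discussed below.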

The Shannon entropy formula is mathematically and conceptually equivalent to equation (1); the factor of k_B out front reflects two conventions: the choice of base for the logarithm and the use of an arbitrary temperature scale with water as a reference substance. [See [http://www.av8n.com/physics/thermo-laws.htm "The Laws of Thermodynamics"], including careful definitions of energy, entropy, et cetera.]

The importance of this formula is discussed at much greater length in the main article "Entropy (thermodynamics)".

This S is almost universally called simply the entropy. It can also be called the statistical entropy or the thermodynamic entropy without changing the meaning. The von Neumann entropy formula is a slightly more general way of calculating the same thing. The Boltzmann entropy formula can be seen as a corollary of equation (1), valid under the restrictive condition that there is no statistical dependence between the states. [Jaynes, E. T. (1965). [http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf Gibbs vs Boltzmann entropies]. American Journal of Physics, 33, 391–398.]
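The corollary can be checked numerically: when all W accessible states are equally probable (p_i = 1/W, with no statistical dependence), equation (1) reduces to the Boltzmann formula S = k_B ln W. A minimal sketch (the helper function here is an illustration, not a library routine):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(p, k=k_B):
    """Equation (1): S = -k * sum_i p_i ln p_i."""
    return -k * sum(pi * math.log(pi) for pi in p if pi > 0)

W = 1000                          # number of equally probable microstates
uniform = [1.0 / W] * W           # p_i = 1/W for every state
S_gibbs = gibbs_entropy(uniform)
S_boltzmann = k_B * math.log(W)   # Boltzmann formula S = k_B ln W

print(math.isclose(S_gibbs, S_boltzmann))  # True
```

For any non-uniform distribution over the same W states, the Gibbs entropy is strictly smaller than k_B ln W, which is why the Boltzmann formula holds only under the equal-probability condition.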

= See also =

* J. Willard Gibbs
* Entropy (thermodynamics)
* Von Neumann entropy formula
* Boltzmann entropy formula
* Shannon entropy formula

Wikimedia Foundation. 2010.

