Boltzmann's entropy formula

In statistical thermodynamics, Boltzmann's equation is a probability equation relating the entropy "S" of an ideal gas to the quantity "W", which is the number of microstates corresponding to a given macrostate:

: S = k \log W \qquad (1)

where "k" is Boltzmann's constant equal to 1.38062 x 10-23 joule/kelvin and "W" is the number of microstates consistent with the given macrostate. In short, the Boltzmann formula shows the relationship between entropy and the number of ways the atoms or molecules of a thermodynamic system can be arranged. In 1934, Swiss physical chemist Werner Kuhn successfully derived a thermal equation of state for rubber molecules using Boltzmann's formula, which has since come to be known as the entropy model of rubber.

History

The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875, but later put into its current form by Max Planck in about 1900. [[http://scienceworld.wolfram.com/physics/BoltzmannEquation.html Boltzmann equation] – Eric Weisstein's World of Physics (states the year was 1872)] [Perrot, Pierre (1998). "A to Z of Thermodynamics". Oxford University Press. ISBN 0-19-856552-6 (states the year was 1875)] To quote Planck, "the logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases."

The value of W, specifically, is the "Wahrscheinlichkeit" (German for "probability"): the number of possible microstates corresponding to the macroscopic state of a system, that is, the number of (unobservable) "ways" the (observable) thermodynamic state of a system can be realized by assigning different positions and momenta to the various molecules. Boltzmann's paradigm was an ideal gas of N "identical" particles, of which N_i are in the i-th microscopic condition (range) of position and momentum. W can be counted using the formula for permutations

: W = \frac{N!}{\prod_i N_i!} \qquad (2)

where "i" ranges over all possible molecular conditions and ! denotes factorial. The "correction" in the denominator is due to the fact that identical particles in the same condition are indistinguishable. W is sometimes called the "thermodynamic probability" since it is an integer greater than one, while mathematical probabilities are always numbers between zero and one.

Generalization

Boltzmann's formula applies to microstates of the universe as a whole, each possible microstate of which is presumed to be equally probable.

But in thermodynamics, it is important to be able to make the approximation of dividing the universe into a system of interest plus its surroundings, and then to be able to identify the entropy of the system with the system entropy of classical thermodynamics. The microstates of such a thermodynamic system are "not" equally probable: for example, high-energy microstates are less probable than low-energy microstates for a thermodynamic system kept at a fixed temperature by contact with a heat bath.

For thermodynamic systems where microstates of the system may not have equal probabilities, the appropriate generalization, called the Gibbs entropy, is:

: S = -k \sum_i p_i \log p_i \qquad (3)

This reduces to equation (1) if the probabilities p_i are all equal, that is, p_i = 1/W.
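A short numerical check (a minimal sketch, not from the article) makes this reduction explicit: with W equally probable microstates, each p_i = 1/W, and equation (3) collapses to equation (1):

 import math

 k = 1.380649e-23  # Boltzmann's constant in joule/kelvin

 def gibbs_entropy(probabilities):
     # S = -k * sum_i p_i log p_i (equation 3); states with p_i = 0 contribute nothing.
     return -k * sum(p * math.log(p) for p in probabilities if p > 0)

 W = 12
 uniform = [1.0 / W] * W
 assert math.isclose(gibbs_entropy(uniform), k * math.log(W))

 # A non-uniform distribution over four states (e.g. a system coupled to a
 # heat bath) has lower entropy than the uniform one over the same states:
 print(gibbs_entropy([0.5, 0.25, 0.125, 0.125]))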

Boltzmann used a ρ log ρ formula as early as 1866. [Ludwig Boltzmann (1866). "Über die Mechanische Bedeutung des Zweiten Hauptsatzes der Wärmetheorie". "Wiener Berichte", 53, 195–220.] He interpreted ρ as a density in phase space, without mentioning probability, but since this satisfies the axiomatic definition of a probability measure, we can retrospectively interpret it as a probability anyway. Gibbs gave an explicitly probabilistic interpretation in 1878.

Boltzmann himself used an expression equivalent to (3) in his later work [Ludwig Boltzmann (1896, 1898). "Vorlesungen über Gastheorie". J. A. Barth, Leipzig.] and recognized it as more general than equation (1). That is, equation (1) is a corollary of equation (3), not vice versa: in every situation where equation (1) is valid, equation (3) is valid also, but not conversely.

Boltzmann entropy excludes statistical dependencies

The term "Boltzmann entropy" is also sometimes used to indicate entropies calculated on the approximation that the overall probability can be factored into an identical separate term for each particle, i.e., assuming each particle has an identical independent probability distribution, ignoring interactions and correlations between the particles. This is exact for an ideal gas of identical particles, and may or may not be a good approximation for other systems. [Jaynes, E. T. (1965). [http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf Gibbs vs Boltzmann entropies]. "American Journal of Physics", 33, 391–398.]
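The following sketch (hypothetical numbers, not from the article) shows what that factorization implies: for N independent, identically distributed particles, the Gibbs entropy of the joint distribution is exactly N times the single-particle entropy:

 import math
 from itertools import product

 k = 1.380649e-23  # Boltzmann's constant in joule/kelvin

 def entropy(ps):
     # S = -k * sum_i p_i log p_i over the nonzero probabilities.
     return -k * sum(p * math.log(p) for p in ps if p > 0)

 single = [0.7, 0.2, 0.1]   # hypothetical one-particle distribution
 N = 3                      # number of (assumed independent) particles

 # Joint probabilities of every N-particle configuration under independence:
 joint = [math.prod(combo) for combo in product(single, repeat=N)]

 # Entropy is additive over independent particles:
 assert math.isclose(entropy(joint), N * entropy(single))
 print(entropy(joint))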

See also

*History of entropy
*Gibbs entropy

References

External links

* [http://www.chemsoc.org/exemplarchem/entries/pkirby/exemchem/Boltzmann/Boltzmann.html Introduction to Boltzmann's Equation]

