Gibbs paradox

In statistical mechanics, a semi-classical derivation of the entropy that does not take into account the indistinguishability of particles yields an expression for the entropy which is not extensive (is not proportional to the amount of substance in question). This leads to an apparent paradox known as the Gibbs paradox: it allows, for instance, the entropy of closed systems to decrease, violating the second law of thermodynamics. It is possible, however, to take the perspective that it is merely the definition of entropy that is changed to ignore particle permutations, and thereby avert the paradox.

Illustration of the problem

Gibbs himself considered the following problem, which arises if the ideal gas entropy is not extensive.[1] Two identical containers of an ideal gas sit side by side. There is a certain amount of entropy S associated with each container of gas, and this depends on the volume of each container. Now a door in the container walls is opened to allow the gas particles to mix between the containers. No macroscopic changes occur, as the system is in equilibrium. The entropy of the gas in the two-container system can be calculated immediately, but if the equation is not extensive, the entropy is not 2S; in fact, Gibbs' non-extensive entropy equation predicts additional entropy. Closing the door then reduces the entropy again to 2S, in supposed violation of the second law of thermodynamics.

As understood by Gibbs,[2] and reemphasized more recently,[3][4] this is a misuse of the entropy equation. If the gas particles are distinguishable, closing the door will not return the system to its original state: many of the particles will have switched containers. There is a freedom in what is defined as ordered, and it would be a mistake to conclude that the entropy had not increased. In particular, Gibbs' non-extensive entropy equation for an ideal gas was not intended to handle varying numbers of particles.

The paradox is averted by concluding that the particles in the volume are indistinguishable, or at least effectively so. This results in the extensive Sackur–Tetrode equation for the entropy, as derived below.

Calculating the entropy of an ideal gas, and making it extensive

In classical mechanics, the state of an ideal gas of energy U, volume V and with N particles, each particle having mass m, is represented by specifying the momentum vector p and the position vector x for each particle. This can be thought of as specifying a point in a 6N-dimensional phase space, where each of the axes corresponds to one of the momentum or position coordinates of one of the particles. The set of points in phase space that the gas could occupy is specified by the constraint that the gas will have a particular energy:

U = \frac{1}{2m}\sum_{i=1}^{N} \left( p_{ix}^2 + p_{iy}^2 + p_{iz}^2 \right)

and be contained inside the volume V (say V is a box of side X, so that V = X^3):

0 \le x_{ij} \le X

for i = 1, ..., N and j = 1, 2, 3

The first constraint defines the surface of a 3N-dimensional hypersphere of radius (2mU)^{1/2} and the second is a 3N-dimensional hypercube of volume V^N. These combine to form a 6N-dimensional hypercylinder. Just as the area of the wall of a cylinder is the circumference of the base times the height, so the area φ of the wall of this hypercylinder is:


\phi(U,V,N) = V^N \left(\frac{2\pi^{\frac{3N}{2}}(2mU)^{\frac{3N-1}{2}}}{\Gamma(3N/2)}\right)~~~~~~~~~~~(1)
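As a numerical sanity check on Equation 1 (a sketch, not part of the original derivation; the function name ln_phi and units with m = 1 are illustrative), the formula is best evaluated in log space, since φ itself overflows for any realistic N:

```python
import math

def ln_phi(U, V, N, m=1.0):
    """Natural log of the hypercylinder wall 'area' phi(U, V, N) from Eq. (1)."""
    return (N * math.log(V)                            # V^N factor
            + math.log(2.0)                            # the 2 in the numerator
            + 1.5 * N * math.log(math.pi)              # pi^(3N/2)
            + 0.5 * (3 * N - 1) * math.log(2 * m * U)  # (2mU)^((3N-1)/2)
            - math.lgamma(1.5 * N))                    # Gamma(3N/2)

# For N = 1 the formula reduces to phi = 2 pi^(3/2) (2mU) V / Gamma(3/2) = 8 pi m U V,
# which gives an easy closed-form check of the log-space arithmetic.
```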

The entropy is proportional to the logarithm of the number of states that the gas could have while satisfying these constraints. In classical physics, the number of states is infinitely large, but according to quantum mechanics it is finite. Before the advent of quantum mechanics, this infinity was regularized by making phase space discrete: phase space was divided into blocks of volume h^{3N}. The constant h thus appeared as the result of a mathematical trick and was thought to have no physical significance. However, using quantum mechanics one recovers the same formalism in the semi-classical limit, but now with h being Planck's constant. One can see this qualitatively from Heisenberg's uncertainty principle: a volume in phase space smaller than h^{3N} (h is Planck's constant) cannot be specified.

To compute the number of states we must compute the volume in phase space in which the system can be found and divide that by h^{3N}. This leads to another problem: the volume seems to be zero, as the region of phase space in which the system can be is a surface of zero thickness. This problem is an artifact of having specified the energy U with infinite accuracy. In a generic system without symmetries, a full quantum treatment would yield a discrete, non-degenerate set of energy eigenstates. An exact specification of the energy would then fix the precise state the system is in, so the number of states available to the system would be one, and the entropy would thus be zero.

When we specify the internal energy to be U, what we really mean is that the total energy of the gas lies somewhere in an interval of length δU around U. Here δU is taken to be very small; it turns out that the entropy does not depend strongly on the choice of δU for large N. This means that the above "area" φ must be extended to a shell of thickness equal to an uncertainty in momentum \delta p = \delta\left(\sqrt{2 m U}\right) = \sqrt{\frac{m}{2 U}}\,\delta U, so the entropy is given by:

S = k\,\ln(\phi\,\delta p/h^{3N})

where the constant of proportionality is k, Boltzmann's constant. Using Stirling's approximation for the Gamma function, which omits terms of order less than N, the entropy for large N becomes:


S = k N \ln
\left[ V  \left(\frac UN \right)^{\frac 32}\right]+
{\frac 32}kN\left( 1+ \ln\frac{4\pi m}{3h^2}\right)
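The Stirling step can be verified numerically (an illustrative sketch in assumed units k = h = m = δU = 1; the function names are not from the text) by comparing the exact S = k ln(φ δp/h^{3N}) against this large-N expression:

```python
import math

def S_exact(U, V, N):
    """k * ln(phi * delta_p / h^{3N}) in units k = h = m = deltaU = 1."""
    ln_phi = (N * math.log(V) + math.log(2.0)
              + 1.5 * N * math.log(math.pi)
              + 0.5 * (3 * N - 1) * math.log(2 * U)
              - math.lgamma(1.5 * N))
    ln_delta_p = 0.5 * math.log(1.0 / (2 * U))  # ln of sqrt(m/(2U)) * deltaU
    return ln_phi + ln_delta_p

def S_stirling(U, V, N):
    """The large-N entropy formula above, with k = h = m = 1."""
    return (N * math.log(V * (U / N) ** 1.5)
            + 1.5 * N * (1 + math.log(4 * math.pi / 3)))

# The two differ only by terms of order ln(N), which are negligible
# against the O(N) total for a macroscopic particle number.
```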

This quantity is not extensive, as can be seen by considering two identical volumes with the same particle number and the same energy. Suppose the two volumes are separated by a barrier at the beginning. Removing or reinserting the wall is reversible, but the entropy difference after removing the barrier is


\delta S = k \left[ 2N \ln(2V) - N\ln V - N \ln V \right] = 2 k N \ln 2 > 0

which is in contradiction to thermodynamics. This is the Gibbs paradox.
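The spurious entropy increase is easy to exhibit numerically (a sketch in assumed units k = h = m = 1; the function name is illustrative): doubling U, V and N in the non-extensive formula yields an extra 2kN ln 2.

```python
import math

def S_nonextensive(U, V, N, k=1.0, m=1.0, h=1.0):
    """The non-extensive large-N entropy formula from the text."""
    return (k * N * math.log(V * (U / N) ** 1.5)
            + 1.5 * k * N * (1 + math.log(4 * math.pi * m / (3 * h**2))))

U, V, N = 1.0, 1.0, 100
delta_S = S_nonextensive(2*U, 2*V, 2*N) - 2 * S_nonextensive(U, V, N)
# delta_S comes out to 2*N*ln(2): a spurious entropy of "mixing"
# two samples of identical gas, the Gibbs paradox.
```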

The paradox is resolved by postulating that the gas particles are in fact indistinguishable. This means that all states that differ only by a permutation of particles should be considered as the same state. For example, if we have a 2-particle gas and we specify AB as a state of the gas where the first particle (A) has momentum p1 and the second particle (B) has momentum p2, then this state as well as the BA state where the B particle has momentum p1 and the A particle has momentum p2 should be counted as the same state.

For an N-particle gas, there are N! states which are identical in this sense, if one assumes that each particle is in a different single-particle state. One can safely make this assumption provided the gas is not at extremely high density. Under normal conditions, one can thus calculate the volume of phase space occupied by the gas by dividing Equation 1 by N!. Using the Stirling approximation again for large N, ln(N!) ≈ N ln(N) − N, the entropy for large N is:


S = k N \ln
\left[ \left(\frac VN\right)  \left(\frac UN \right)^{\frac 32}\right]+
{\frac 32}kN\left( {\frac 53}+ \ln\frac{4\pi m}{3h^2}\right)

which can be easily shown to be extensive. This is the Sackur-Tetrode equation.
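Extensivity can be checked numerically as well (again a sketch in assumed units k = h = m = 1): scaling U, V and N by a common factor scales the entropy by that factor, and merging two identical volumes now produces no entropy change.

```python
import math

def S_sackur_tetrode(U, V, N, k=1.0, m=1.0, h=1.0):
    """The Sackur-Tetrode entropy, i.e. the formula above with the N! correction."""
    return (k * N * math.log((V / N) * (U / N) ** 1.5)
            + 1.5 * k * N * (5.0/3.0 + math.log(4 * math.pi * m / (3 * h**2))))

U, V, N = 1.0, 1.0, 100
# Doubling everything doubles S, so removing the barrier between
# two identical samples costs no entropy:
delta_S = S_sackur_tetrode(2*U, 2*V, 2*N) - 2 * S_sackur_tetrode(U, V, N)
# delta_S is zero up to floating-point rounding
```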

The mixing paradox

A closely related paradox is the mixing paradox. Again take a box with a partition in it, with gas A on one side and gas B on the other, both at the same temperature and pressure. If gases A and B are different, there is an entropy increase due to the mixing. If the gases are the same, no additional entropy is calculated. The additional entropy from mixing does not depend on how different the gases are. The paradox is that the two gases can be arbitrarily similar, yet the entropy of mixing does not disappear unless they are the same gas.

The resolution is provided by a careful understanding of entropy. In particular, as explained concisely by Jaynes,[2] there is an arbitrariness in the definition of entropy.

A central example in Jaynes' paper relies on the fact that, if one develops a theory based on the idea that the two different types of gas are indistinguishable, and one never carries out any measurement which detects this fact, then the theory will have no internal inconsistencies. In other words, if we have two gases A and B and we have not yet discovered that they are different, then assuming they are the same will cause us no theoretical problems. If ever we perform an experiment with these gases that yields incorrect results, we will certainly have discovered a method of detecting their difference and recalculating the entropy increase when the partition is removed.

This insight suggests that the ideas of thermodynamic state and entropy are somewhat subjective. The differential increase in entropy (dS) as a result of mixing dissimilar element sets (the gases), multiplied by the temperature (T), is equal to the minimum amount of work we must do to restore the gases to their original separated state. Suppose that the two different gases are separated by a partition, but that we cannot detect the difference between them. We remove the partition. How much work does it take to restore the original thermodynamic state? None: simply reinsert the partition. The fact that the different gases have mixed does not yield a detectable change in the state of the gas, if by state we mean a unique set of values for all parameters that we have available to us to distinguish states. The minute we become able to detect the difference, the amount of work necessary to recover the original macroscopic configuration becomes non-zero, and that amount of work does not depend on the magnitude of the difference.

This line of reasoning is particularly informative when considering the concepts of indistinguishable particles and correct Boltzmann counting. Boltzmann's original expression for the number of states available to a gas assumed that a state could be expressed in terms of a number of energy "sublevels", each of which contains a particular number of particles. While the particles in a given sublevel were considered indistinguishable from each other, particles in different sublevels were considered distinguishable from particles in any other sublevel. This amounts to saying that the exchange of two particles in two different sublevels will result in a detectably different "exchange macrostate" of the gas. For example, if we consider a simple gas with N particles, at sufficiently low density that it is practically certain that each sublevel contains either one particle or none (i.e. a Maxwell–Boltzmann gas), this means that a simple container of gas will be in one of N! detectably different "exchange macrostates", one for each possible particle exchange. Just as the mixing paradox begins with two detectably different containers, and the extra entropy that results upon mixing is proportional to the average amount of work needed to restore that initial state after mixing, so the extra entropy in Boltzmann's original derivation is proportional to the average amount of work required to restore the simple gas from some "exchange macrostate" to its original "exchange macrostate". If we assume that there is in fact no experimentally detectable difference between these "exchange macrostates", then using the entropy which results from assuming the particles are indistinguishable will yield a consistent theory. This is "correct Boltzmann counting".

It is often said that the resolution of the Gibbs paradox derives from the fact that, according to quantum theory, like particles are indistinguishable in principle. By Jaynes' reasoning, if the particles are experimentally indistinguishable for whatever reason, the Gibbs paradox is resolved; quantum mechanics only provides an assurance that, in the quantum realm, this indistinguishability holds as a matter of principle, rather than being due to an insufficiently refined experimental capability.

References

  1. ^ Gibbs, J. Willard (1875–1878). On the Equilibrium of Heterogeneous Substances. Connecticut Acad. Sci. ISBN 0849396859. Reprinted in
    • Gibbs, J. Willard (October 1993). The Scientific Papers of J. Willard Gibbs (Vol. 1). Ox Bow Press. ISBN 0-918024-77-3. 
    • Gibbs, J. Willard (February 1994). The Scientific Papers of J. Willard Gibbs (Vol. 2). Ox Bow Press. ISBN 1-881987-06-X. 
  2. ^ a b Jaynes, E.T. (1996). "The Gibbs Paradox" (PDF). http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf. Retrieved November 8, 2005. 
  3. ^ The Many Faces of Entropy, Harold Grad, 1961
  4. ^ N. G. van Kampen, “The Gibbs Paradox” in Essays in Theoretical Physics in Honor of Dirk ter Haar, ed. by W. E. Parry (Pergamon, Oxford, 1984).
  • Tseng; Caticha (2001). "Yet another resolution of the Gibbs paradox: an information theory approach". In R. L. Fry (ed.), Bayesian Inference and Maximum Entropy Methods in Science and Engineering, AIP Conference Proceedings 617: 331. arXiv:cond-mat/0109324. doi:10.1063/1.1477057. 
  • Dieks, Dennis (2010). "The Gibbs Paradox Revisited". arXiv:1003.0179 [quant-ph]. 
  • Grad, Harold (1961). "The Many Faces of Entropy". 
  • N. G. van Kampen (1984). "The Gibbs Paradox". In W. E. Parry (ed.), Essays in Theoretical Physics: In Honour of Dirk ter Haar. Oxford: Pergamon. 
