# Entropy (classical thermodynamics)


In thermodynamics, entropy is a measure of how close a thermodynamic system is to equilibrium. A thermodynamic system is any physical object or region of space that can be described by its thermodynamic quantities such as temperature, pressure, volume and density. In simple terms, the second law of thermodynamics states that for a system, the differences in intensive thermodynamic quantities such as temperature, pressure, and chemical potential tend to even out as time goes by, unless there is an outside influence which works to maintain the differences. The end point of this evening-out process is called equilibrium, and as the system moves toward equilibrium, the entropy of the system increases, becoming a maximum at equilibrium.

There are two equivalent definitions of entropy. The first definition is the thermodynamic definition. It was developed in the early 1850s by Rudolf Clausius and essentially describes how to measure entropy. It makes no reference to the microscopic nature of matter. The second definition is the statistical definition developed by Ludwig Boltzmann in the 1870s. This definition describes the entropy as a measure of the number of possible microscopic configurations of the individual atoms and molecules of the system (microstates) which would give rise to the observed macroscopic state (macrostate) of the system. Boltzmann then went on to show that this definition of entropy was proportional to the thermodynamic entropy, with a constant of proportionality that has since been known as Boltzmann's constant.

This article is concerned with the thermodynamic definition of entropy. Although thermodynamic entropy is a self-contained subject, it should be understood in parallel with the statistical definition. When the thermodynamic definition becomes most difficult to understand, the statistical definition brings a simple explanation, and where the link between the statistical theory and experiment becomes extended, the thermodynamic theory delivers a straightforward answer.

Clausius defined the "change in entropy" "dS" of a thermodynamic system, during a reversible process, as

:$dS = \frac{\delta Q}{T}$

where δ"Q" is a small amount of heat introduced to the system, and "T" is the (constant) absolute temperature at which the transfer takes place.
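For a reversible transfer at constant temperature, this definition integrates to ΔS = Q/T. A minimal sketch, with illustrative numbers:

```python
# Entropy change for a reversible heat transfer at constant temperature:
# ΔS = Q / T.  The values below are illustrative, not measured data.

def entropy_change(q_joules, temp_kelvin):
    """Reversible entropy change for heat q transferred at temperature T."""
    return q_joules / temp_kelvin

# 1000 J of heat added reversibly to a system held at 300 K:
dS = entropy_change(1000.0, 300.0)
print(dS)  # about 3.33 J/K
```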

Note that the small amount of energy transferred by heating is denoted $\delta Q$ rather than $dQ$, because "Q" is not a state function, whereas the entropy "S" is.

Clausius gave the quantity "S" the name "entropy", from the Greek word "τροπή", "transformation". Since this definition involves only differences in entropy, the entropy itself is only defined up to an arbitrary additive constant.

When a process is irreversible, the above definition must be replaced by the statement that the entropy change is equal to the amount of energy required to return the system to its original state by a reversible transformation at a constant temperature, divided by that temperature. This is explained in more detail below.

## Introduction

In a thermodynamic "universe" consisting of a system together with its surroundings, pressure differences, density differences, and temperature differences all tend to equalize over time. Consider, for example, a glass of ice water in a warm room: the difference in temperature between the warm room (the surroundings) and the cold glass of ice and water (the system, not part of the room) begins to equalize as heat from the warm surroundings spreads out to the cooler system of ice and water. Over time, the temperature of the glass and its contents and the temperature of the room become equal. The entropy of the room decreases, because some of its energy has been transferred to the ice and water. However, the entropy of the system of ice and water increases by more than the entropy of the surrounding room decreases. In an isolated system, such as the room and ice water taken together, the dispersal of energy from warmer to cooler regions always results in a net increase in entropy. Thus, when the "universe" of the room and ice-water system has reached temperature equilibrium, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far this equalization has progressed.
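The ice-water bookkeeping can be made concrete. The sketch below uses the standard latent heat of fusion of ice (about 334 J/g) and assumes, for simplicity, that all heat flows at the two fixed temperatures:

```python
# Net entropy change when heat Q flows from a warm room (T_ROOM) into
# melting ice (T_ICE).  The mass is illustrative; the other constants are
# standard reference values.

L_FUSION = 334.0       # J per gram, latent heat of fusion of ice
T_ICE = 273.15         # K, melting point of ice
T_ROOM = 298.15        # K, temperature of the warm room

mass = 10.0            # grams of ice (illustrative)
Q = mass * L_FUSION    # heat absorbed by the ice while melting

dS_system = Q / T_ICE          # entropy gained by the ice/water system
dS_room = -Q / T_ROOM          # entropy lost by the room
dS_net = dS_system + dS_room   # net change: always positive (second law)

print(dS_system, dS_room, dS_net)
```

Because the ice absorbs heat at a lower temperature than the room gives it up, the system's entropy gain outweighs the room's entropy loss, and the net change is positive.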

A special case of entropy increase, the entropy of mixing, occurs when two or more different substances are mixed. If the substances are at the same temperature and pressure, there will be no net exchange of heat or work; the entropy increase is then entirely due to the mixing of the different substances. [See, e.g., [http://www.entropysite.com/calpoly_talk.html Notes for a "Conversation About Entropy"] for a brief discussion of "both" thermodynamic and "configurational" ("positional") entropy in chemistry.]

From a "macroscopic perspective", in classical thermodynamics the entropy is interpreted as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. The state function has the important property that, when multiplied by a reference temperature, it can be understood as a measure of the amount of energy in a physical system that cannot be used to do thermodynamic work, i.e., work mediated by thermal energy. More precisely, in any process where the system gives up energy Δ"E" and its entropy falls by Δ"S", at least "T"R Δ"S" of that energy must be given up to the system's surroundings as unusable heat, where "T"R is the temperature of the system's external surroundings; otherwise the process cannot go forward.

In 1862, Clausius stated what he called the "theorem respecting the equivalence-values of the transformations", now known as the second law of thermodynamics, as such:

:"The algebraic sum of all the transformations occurring in a cyclical process can only be positive, or, as an extreme case, equal to nothing."

Quantitatively, Clausius stated the mathematical expression of this theorem as follows. Let "δQ" be an element of the heat given up by the body to any reservoir of heat during its own changes, heat which it may absorb from a reservoir being here reckoned as negative, and "T" the absolute temperature of the body at the moment of giving up this heat, then the equation:

:$\oint \frac{\delta Q}{T} = 0$

must be true for every reversible cyclical process, and the relation:

:$\oint \frac{\delta Q}{T} \ge 0$

must hold good for every cyclical process which is in any way possible. This is the essential formulation of the second law and one of the original forms of the concept of entropy. The dimensions of entropy are energy divided by temperature, the same as the dimensions of Boltzmann's constant ("k") and of heat capacity; the SI unit of entropy is the joule per kelvin (J·K⁻¹). In this manner, the quantity Δ"S" enters the energy balance equation of a given system as a term accounting for the effects of irreversibility. In the Gibbs free energy equation, ΔG = ΔH − TΔS, which is commonly used to determine whether a chemical reaction will occur, the energy associated with entropy changes, TΔS, is subtracted from the total enthalpy change ΔH to give the "free" energy ΔG of the system, as during a chemical process or when a system changes state.
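The sign of ΔG = ΔH − TΔS can flip with temperature, which is how an endothermic process can still be spontaneous. A minimal sketch, with illustrative (not measured) values for ΔH and ΔS:

```python
# ΔG = ΔH − TΔS decides whether a process is spontaneous at temperature T
# (ΔG < 0).  The numbers below are illustrative, not measured data.

def gibbs_free_energy(dH, T, dS):
    """ΔG in J/mol, given ΔH in J/mol, T in K, ΔS in J/(mol·K)."""
    return dH - T * dS

# An endothermic process (ΔH > 0) driven by an entropy increase:
dH = 6010.0    # J/mol (roughly the order of magnitude of ice melting)
dS = 22.0      # J/(mol·K)

print(gibbs_free_energy(dH, 250.0, dS))  # positive: not spontaneous
print(gibbs_free_energy(dH, 300.0, dS))  # negative: spontaneous
```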

## Heat engines

Clausius' identification of "S" as a significant quantity was motivated by the study of reversible and irreversible thermodynamic transformations. A thermodynamic transformation is a change in a system's thermodynamic properties, such as temperature and volume. A transformation is reversible if it is quasistatic, meaning that the system remains infinitesimally close to thermodynamic equilibrium at all times. Otherwise, the transformation is irreversible. To illustrate this, consider a gas enclosed in a piston chamber, whose volume may be changed by moving the piston. If we move the piston slowly enough, the density of the gas is always homogeneous, so the transformation is reversible. If we move the piston quickly, pressure waves are created, so the gas is not in equilibrium, and the transformation is irreversible.

A heat engine is a thermodynamic system that can undergo a sequence of transformations which ultimately return it to its original state. Such a sequence is called a cyclic process, or simply a "cycle". During some transformations, the engine may exchange energy with the environment. The net result of a cycle is (i) mechanical work done by the system (which can be positive or negative, the latter meaning that work is done "on" the engine), and (ii) heat energy transferred from one part of the environment to another. By the conservation of energy, the net energy lost by the environment is equal to the work done by the engine.

If every transformation in the cycle is reversible, the cycle is reversible, and it can be run in reverse, so that the energy transfers occur in the opposite direction and the amount of work done switches sign.

## Definition of temperature

In thermodynamics, absolute temperature is "defined" in the following way. Suppose we have two heat reservoirs, which are systems sufficiently large that their temperatures do not change when energy flows into or out of them. If a reversible cycle absorbs a quantity of heat "Q"1 from the first reservoir and gives up a quantity of heat "Q"2 to the second, the ratio of the absolute temperatures of the two reservoirs is defined by the ratio of these heats:

:$\frac{Q_1}{Q_2} = \frac{T_1}{T_2}$

Now consider a reversible cycle in which the engine exchanges heats $\delta Q_1, \delta Q_2, \ldots, \delta Q_N$ with a sequence of "N" heat reservoirs with temperatures "T"1, ..., "T"N. It can be shown that:

:$\sum_{i=1}^N \frac{\delta Q_i}{T_i} = 0$

where:
:δ"Q"i is the amount of heat exchanged "from" the engine "to" the "i"-th reservoir, and
:"T"i is the temperature of that reservoir.

Since the cycle is reversible, the engine is always infinitesimally close to equilibrium, so its temperature equals that of any reservoir with which it is in contact. In the limiting case of a reversible cycle consisting of a "continuous" sequence of transformations,

:$\oint \frac{\delta Q}{T} = 0$ (reversible cycles)

where the integral is taken over the entire cycle, and "T" is the temperature of the system at each point in the cycle. This is a particular case of the Clausius theorem.
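This equality can be checked numerically for the simplest reversible cycle, a Carnot cycle between two reservoirs. The sketch below assumes the reversible-engine relation "Q"c/"Q"h = "T"c/"T"h and uses illustrative temperatures:

```python
# Numerical check of the Clausius equality ∮ δQ/T = 0 for a reversible
# (Carnot) cycle: heat Q_h is absorbed at T_h and heat Q_c rejected at T_c,
# with Q_c/Q_h = T_c/T_h for a reversible engine.

T_h, T_c = 500.0, 300.0     # reservoir temperatures in K (illustrative)
Q_h = 1000.0                # heat absorbed from the hot reservoir, J
Q_c = Q_h * T_c / T_h       # heat rejected, fixed by reversibility

# Heat absorbed counts as positive, heat rejected as negative:
clausius_sum = Q_h / T_h - Q_c / T_c
print(clausius_sum)   # 0.0 for a reversible cycle

work = Q_h - Q_c      # net work output, by conservation of energy
print(work)           # 400.0 J
```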

## Entropy as a state function

We can now deduce an important fact about the entropy change during "any" thermodynamic transformation, not just a cycle. First, consider a reversible transformation that brings a system from an equilibrium state "A" to another equilibrium state "B". If we follow this with "any" reversible transformation which returns that system to state "A", our above result says that the net entropy change is zero. This implies that the entropy change in the first transformation depends "only on the initial and final states".

This allows us to define the entropy of any "equilibrium" state of a system. Choose a reference state "R" and call its entropy "SR". The entropy of any equilibrium state "X" is

:$S_X = S_R + \int_R^X \frac{\delta Q}{T}$

Since the integral is independent of the particular transformation taken, this equation is well-defined.

## Entropy change in irreversible transformations

We now consider irreversible transformations. It can be shown that the entropy change during any transformation between two "equilibrium" states is

:$\Delta S \ge \int \frac{\delta Q}{T}$

where the equality holds if the transformation is reversible.

Notice that if $\delta Q = 0$, then Δ"S" ≥ 0. This is the second law of thermodynamics, discussed earlier in the article.

Suppose a system is thermally and mechanically isolated from the environment. For example, consider an insulating rigid box divided by a movable partition into two volumes, each filled with gas. If the pressure of one gas is higher, it will expand by moving the partition, thus performing work on the other gas. Also, if the gases are at different temperatures, heat can flow from one gas to the other provided the partition is an imperfect insulator. Our above result indicates that the entropy of the system "as a whole" will increase during these processes (in principle it could remain constant, but this is unlikely). Typically, there exists a maximum amount of entropy the system may possess under the circumstances. This entropy corresponds to a state of "stable equilibrium", since a transformation to any other equilibrium state would cause the entropy to decrease, which is forbidden. Once the system reaches this maximum-entropy state, no part of the system can perform work on any other part. It is in this sense that entropy is a measure of the energy in a system that "cannot be used to do work".
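For the partitioned-box example, the entropy increase can be computed explicitly in a simple special case: two bodies with equal, constant heat capacity "C" equilibrating to the mean temperature (an assumption of this sketch, not the general case), for which ΔS = C ln(Tf/T1) + C ln(Tf/T2) ≥ 0.

```python
import math

# Entropy increase when two bodies at different temperatures equilibrate
# inside an isolated box.  Both bodies are assumed to have the same
# constant heat capacity C, so the final temperature is the mean.

C = 10.0                   # J/K, heat capacity of each body (illustrative)
T1, T2 = 400.0, 200.0      # initial temperatures, K (illustrative)
T_f = (T1 + T2) / 2        # final common temperature

# ΔS = C·ln(T_f/T1) + C·ln(T_f/T2): the cooling body loses less entropy
# than the warming body gains, so the total is positive.
dS = C * math.log(T_f / T1) + C * math.log(T_f / T2)
print(dS)   # positive, consistent with the second law
```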

## Measuring entropy

In real experiments, it is quite difficult to measure the entropy of a system. The techniques for doing so are based on the thermodynamic definition of the entropy, and require extremely careful calorimetry.

For simplicity, we will examine a mechanical system, whose thermodynamic state may be specified by its volume "V" and pressure "P". In order to measure the entropy of a specific state, we must first measure the heat capacity at constant volume and at constant pressure (denoted "CV" and "CP" respectively), for a successive set of states intermediate between a reference state and the desired state. The heat capacities are related to the entropy "S" and the temperature "T" by

:$C_X = T \left(\frac{\partial S}{\partial T}\right)_X$

where the "X" subscript refers to either constant volume or constant pressure. This may be integrated numerically to obtain a change in entropy:

:$\Delta S = \int \frac{C_X}{T}\, dT$

We can thus obtain the entropy of any state ("P","V") with respect to a reference state ("P0","V0"). The exact formula depends on our choice of intermediate states. For example, if the reference state has the same pressure as the final state,

:$S(P,V) = S(P, V_0) + \int_{T(P,V_0)}^{T(P,V)} \frac{C_P(P, V(T,P))}{T}\, dT$

In addition, if the path between the reference and final states lies across any first order phase transition, the latent heat associated with the transition must be taken into account.
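As a sketch of this bookkeeping, consider heating one gram of ice from below freezing to room temperature: two ∫C/T terms plus a latent-heat term at the melting point. The heat capacities are treated as constant, which is an approximation; the per-gram constants are standard reference values.

```python
import math

# Entropy of heating, per gram: ΔS = ∫ C/T dT along the path, plus
# L/T_melt across the first-order phase transition.  Heat capacities are
# approximated as constant over each temperature range.

c_ice = 2.09       # J/(g·K), specific heat of ice (approximate)
c_water = 4.18     # J/(g·K), specific heat of liquid water (approximate)
L_fusion = 334.0   # J/g, latent heat of fusion
T_melt = 273.15    # K, melting point

def entropy_of_heating(T_start, T_end):
    """ΔS per gram, heating ice at T_start (K) to liquid water at T_end (K)."""
    dS = c_ice * math.log(T_melt / T_start)      # warm the ice to the melting point
    dS += L_fusion / T_melt                      # latent heat at the transition
    dS += c_water * math.log(T_end / T_melt)     # warm the liquid
    return dS

print(entropy_of_heating(263.15, 298.15))  # about 1.67 J/(g·K)
```

Note that the latent-heat term dominates: most of the entropy gain comes from the phase transition itself, not from the temperature changes.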

The entropy of the reference state must be determined independently. Ideally, one chooses a reference state at an extremely high temperature, at which the system exists as a gas. The entropy of such a state is that of a classical ideal gas plus contributions from molecular rotations and vibrations, which may be determined spectroscopically. Choosing a "low"-temperature reference state is sometimes problematic, since the entropy at low temperatures may behave in unexpected ways. For instance, a calculation of the entropy of ice from a low-temperature reference state, assuming zero entropy at zero temperature, falls short of the value obtained with a high-temperature reference state by 3.41 J/(mol·K). This discrepancy is due to the residual ("zero-point") entropy of ice.

## See also

*Entropy
*Enthalpy
*Thermodynamic free energy
*History of entropy
*Entropy (statistical views)

## References

*Goldstein, Martin, and Inge F. Goldstein, 1993. "The Refrigerator and the Universe". Harvard Univ. Press. A gentle introduction at a lower level than this article.

* [http://physics.thinkingpal.com/a-simple-and-short-explanation-of-entropy/ Thermodynamic entropy]
