Introduction to entropy

Thermodynamic entropy provides a measure of certain aspects of energy in relation to absolute temperature. The thermodynamic entropy S, often simply called the entropy in the context of thermodynamics, is a measure of the amount of energy in a physical system that cannot be used to do work; it is also described as a measure of the disorder present in a system. In thermodynamics, entropy appears among the basic thermodynamic potentials, alongside U (internal energy) and A (Helmholtz energy). Entropy can be viewed as a measure of the uniformity of the distribution of energy.


The concept of thermodynamic entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. In a general sense the second law says that temperature differences between systems in contact with each other tend to even out and that work can be obtained from these non-equilibrium differences, but that loss of heat occurs, in the form of entropy, when work is done.

The concept of energy is central to the first law of thermodynamics, which deals with the conservation of energy and under which the loss in heat will result in a decrease in the internal energy of the thermodynamic system. Thermodynamic entropy provides a comparative measure of the amount of this decrease in internal energy of the system and the corresponding increase in internal energy of the surroundings at a given temperature. A simple and more concrete visualisation of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of a spontaneous process: how much energy has flowed or how widely it has become spread out at a specific temperature.

Entropy has been developed to describe any of several phenomena, depending on the field and the context in which it is being used. Information entropy takes the mathematical concepts of statistical thermodynamics into areas of probability theory unconnected with heat and energy.

Entropy is an integral part of the second law of thermodynamics, which can be stated as follows:

"Temperature differences between thermodynamic systems in contact with each other tend to even out, and work can be obtained from these non-equilibrium differences; but loss of heat occurs, in the form of entropy, when work is done."

In this form it provides a measure of the extent to which a heat engine can never completely recycle unused heat into work as a perpetual motion machine might, but will always convert some of the heat into internal energy due to intermolecular interactions, which is not available to do work. Entropy also relates to all kinds of energy and to other fields of science such as chemistry, where the entropy of materials measures the energy required to raise the material to its state at a given temperature from absolute zero.

In calculations, entropy is symbolised by S and is a measure at a particular instant: a state function. Thus entropy, as energy Q in relation to absolute temperature T, is expressed as S = Q/T. Often the change in entropy, symbolised by ΔS, is considered in relation to a small change in energy, δQ.

Statistical mechanics introduces calculation of entropy using probability theory to find the number of possible microstates at an instant. At any given instant, the system is in one of these microstates, each of which is equally probable if the system is in equilibrium. Each microstate is a possible distribution of energy amongst the different forms of energy in the system, such as the potential, kinetic and rotational energy of the elements in the system. The entropy is then proportional to the natural logarithm of the number of possible microstates. The natural logarithm is used for two reasons. First, the sheer number of microstates is too large for practical considerations. Second, logarithms are additive, which is easier to work with than the microstate counts themselves, which multiply when independent systems are combined. ["Introduction to Statistical Mechanics and Thermodynamics" by Keith Stowe, pages 128-129.] Statistical mechanical entropy is mathematically similar to Shannon entropy, which is part of information theory, where energy is not involved. This similarity means that some probabilistic aspects of thermodynamics are replicated in information theory.
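The additivity of logarithms mentioned above can be illustrated with a short sketch. The following Python snippet is an illustrative toy, not a physical simulation: it computes the Boltzmann entropy S = k ln W for two independent subsystems (with made-up microstate counts) and shows that, while microstate counts multiply, the entropies simply add.

```python
import math

# Boltzmann constant in J/K
k_B = 1.380649e-23

def boltzmann_entropy(microstates):
    """Entropy S = k_B * ln(W) for W equally probable microstates."""
    return k_B * math.log(microstates)

# Two independent subsystems: their microstate counts multiply...
W1, W2 = 1e20, 1e30
S_combined = boltzmann_entropy(W1 * W2)

# ...but their entropies simply add, which is why the logarithm is used.
S_sum = boltzmann_entropy(W1) + boltzmann_entropy(W2)

print(S_combined, S_sum)  # the two values agree
```

The microstate counts here are arbitrary small examples; real macroscopic systems have W far too large to write down directly, which is the first reason given above for working with the logarithm.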

Example of entropy increasing

Ice melting provides a classic example of entropy increasing in a small 'universe': a thermodynamic system consisting of the 'surroundings' (the warm room) and the 'system' of glass, ice and cold water which has been allowed to reach thermodynamic equilibrium at the melting temperature of ice. In this universe, some heat energy δQ from the warmer room surroundings at 298 K (25°C, 77°F) will spread out to the cooler system of ice and water at its constant temperature T of 273 K (0°C, 32°F), the melting temperature of ice. Thus, the entropy of the system, which is δQ/T, increases by δQ/273 K. (The heat δQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. the ΔH for ice fusion.) It is important to realize that the entropy of the surrounding room decreases less than the entropy of the ice and water increases: the room temperature of 298 K is higher than 273 K, and therefore the entropy change δQ/298 K for the surroundings is smaller in magnitude than the entropy change δQ/273 K for the ice and water system. This is always true in spontaneous events in a thermodynamic system, and it shows "the predictive importance of entropy: the final net entropy after such an event is always greater than was the initial entropy." As the temperature of the cool water rises to that of the room and the room cools imperceptibly further, the sum of δQ/T over the continuous range of temperatures, 'at many increments', from the initially cool to the finally warm water can be found by calculus. The entire miniature 'universe', i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.
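The bookkeeping in the ice-melting example can be checked numerically. The Python sketch below uses the figures quoted elsewhere in this article (6008 J per mole for the enthalpy of fusion of ice, 273 K for the ice and 298 K for the room) to confirm that the entropy gained by the ice and water exceeds the entropy lost by the room, so the net entropy of the little 'universe' increases.

```python
# Heat transferred: enthalpy of fusion for one mole of ice (J)
Q = 6008.0

T_system = 273.0        # melting ice and water (K)
T_surroundings = 298.0  # warm room (K)

dS_system = Q / T_system                # entropy gained by the ice + water
dS_surroundings = -Q / T_surroundings   # entropy lost by the room

dS_universe = dS_system + dS_surroundings
print(f"System:       {dS_system:+.2f} J/K")
print(f"Surroundings: {dS_surroundings:+.2f} J/K")
print(f"Universe:     {dS_universe:+.2f} J/K")
```

Because the same quantity of heat is divided by a larger temperature for the surroundings than for the system, the net change is necessarily positive, which is exactly the "predictive importance of entropy" described above.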

Origins and uses

Originally, entropy was named to describe the "waste heat", or more accurately energy losses, from heat engines and other mechanical devices which could never run with 100% efficiency in converting energy into work. Later, the term came to acquire several additional descriptions as more came to be understood about the behavior of molecules on the microscopic level. In the late 19th century the word "disorder" was used by Ludwig Boltzmann in developing statistical views of entropy using probability theory to describe the increased molecular movement on the microscopic level. That was before quantum behavior came to be better understood by Werner Heisenberg and those who followed. Descriptions of thermodynamic (heat) entropy on the microscopic level are found in statistical thermodynamics and statistical mechanics.

For most of the 20th century textbooks tended to describe entropy as "disorder", following Boltzmann's early conceptualisation of the motional energy of molecules. More recently there has been a trend in chemistry and physics textbooks to describe entropy in terms of "dispersal of energy". Entropy can also involve the dispersal of particles, which are themselves energetic. Thus there are instances where both particles and energy disperse at different rates when substances are mixed together.

The mathematics developed in statistical thermodynamics was found to be applicable in other disciplines. In particular, the information sciences developed the concept of information entropy, where a constant replaces the temperature that is inherent in thermodynamic entropy.
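As an illustration of that parallel, the sketch below computes the Shannon entropy H = −Σ p·log(p) of a probability distribution. The formula mirrors the statistical-mechanical expression for entropy, but Boltzmann's constant is replaced by a constant that merely fixes the unit of information (here bits, via base-2 logarithms), and no energy or temperature is involved.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit of uncertainty...
print(shannon_entropy([0.5, 0.5]))   # 1.0

# ...while a biased coin carries less: its distribution is less "spread out".
print(shannon_entropy([0.9, 0.1]))   # about 0.47
```

The more evenly the probability is distributed over the outcomes, the larger H becomes, just as thermodynamic entropy is largest when energy is most uniformly distributed.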

Heat and entropy

At a microscopic level kinetic energy of molecules is responsible for the temperature of a substance or a system. “Heat” is the kinetic energy of molecules being transferred: when motional energy is transferred from hotter surroundings to a cooler system, faster moving molecules in the surroundings collide with the walls of the system and some of their energy gets to the molecules of the system and makes them move faster.

* (molecules in a gas like nitrogen at room temperature at any instant are moving at an average speed of nearly a thousand miles an hour, constantly colliding and therefore exchanging energy so that their individual speeds are always changing, even being motionless for an instant if two molecules with exactly the same speed collide head-on, before another molecule hits them and they race off, as fast as 2500 miles an hour. At higher temperatures average speeds increase and motional energy becomes considerably greater.)
** Thus motional molecular energy (‘heat energy’) from hotter surroundings, like faster moving molecules in a flame or violently vibrating iron atoms in a hot plate, will melt or boil a substance (the system) at the temperature of its melting or boiling point. That amount of motional energy from the surroundings that is required for melting or boiling is called the phase change energy, specifically the enthalpy of fusion or of vaporization, respectively. This phase change energy breaks bonds between the molecules in the system (not chemical bonds inside the molecules that hold the atoms together) rather than contributing to the motional energy and making the molecules move any faster – so it doesn’t raise the temperature, but instead enables the molecules to break free to move as a liquid or as a vapor.
** In terms of energy, when a solid becomes a liquid or a liquid a vapor, motional energy coming from the surroundings is changed to ‘potential energy’ in the substance (phase change energy, which is released back to the surroundings when the surroundings become cooler than the substance's boiling or melting temperature, respectively). Phase change energy increases the entropy of a substance or system because it is energy that must be spread out in the system from the surroundings so that the substance can exist as a liquid or vapor at a temperature above its melting or boiling point. When this process occurs in a ‘universe’ that consists of the surroundings plus the system, the total energy of the universe becomes more dispersed or spread out as part of the greater energy that was only in the hotter surroundings transfers so that some is in the cooler system. This energy dispersal increases the entropy of the 'universe'.

The important overall principle is that "Energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy (or better, entropy change) is the quantitative measure of that kind of a spontaneous process: how much energy has been transferred/T, or how widely it has become spread out at a specific temperature."

Classical calculation of entropy

When entropy was first defined and used in 1865, the very existence of atoms was still controversial, and there was no concept that temperature was due to the motional energy of molecules, or that "heat" was actually the transferring of that motional molecular energy from one place to another. Entropy change, ΔS, was described in macroscopic terms that could be measured, such as volume, temperature or pressure. The 1865 equation, which is still completely valid, is ΔS = q_rev/T. This can be expanded, part by part, in modern terms of how molecules are responsible for what is happening. Here is that equation expanded:

* ΔS = the entropy of a system (i.e., of a substance or a group of substances), after some motional energy ("heat") has been transferred to it by fast moving molecules, minus the entropy of that system before any such energy was transferred to it. So, ΔS = S_final − S_initial.

* Then, ΔS = S_final − S_initial = q_rev/T: the motional energy ("heat") q that is transferred "reversibly" (rev) to the system from the surroundings (or from another system in contact with the first), divided by T, the absolute temperature at which the transfer occurs.
** “Reversible” or “reversibly” (rev) simply means that T, the temperature of the system, has to stay (almost) exactly the same while any energy is being transferred to or from it. That is easy in the case of phase changes, where the system absolutely must stay in the solid or liquid form until enough energy is given to it to break bonds between the molecules before it can change to a liquid or a gas. For example, in the melting of ice at 273.15 K, no matter what the temperature of the surroundings, from 273.20 K to 500 K or even higher, the temperature of the ice will stay at 273.15 K until the last molecules in the ice have changed to liquid water, i.e., until all the hydrogen bonds between the water molecules in ice are broken and new, less exactly fixed hydrogen bonds between liquid water molecules are formed. The amount of energy necessary for melting ice has been found to be 6008 joules per mole at 273 K. Therefore, the entropy change per mole is q_rev/T = 6008 J / 273 K, or about 22 J/K.
** When the temperature is not at the melting or boiling point of a substance, no intermolecular bond-breaking is possible, and so any motional molecular energy ("heat") transferred from the surroundings to a system raises its temperature, making its molecules move faster and faster. As the temperature is constantly rising, there is no longer one particular value of T at which the energy is transferred. However, a "reversible" energy transfer can be measured over a very small temperature increase, and a cumulative total can be found by adding the contributions of many small temperature increments. For example, to find the entropy change q_rev/T from 300 K to 310 K, measure the amount of energy transferred at dozens or hundreds of small temperature increments, say from 300.00 K to 300.01 K and then from 300.01 K to 300.02 K and so on, dividing each q by each T, and finally adding them all.
** Calculus can be used to make this calculation easier if the effect of energy input to the system is linearly dependent on the temperature change, as in simple heating of a system at moderate to relatively high temperatures. Thus, the energy being transferred "per incremental change in temperature" (the heat capacity, Cp), multiplied by the integral of dT/T from T_initial to T_final, directly gives ΔS = Cp ln(T_final/T_initial).
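The two calculations just described can be sketched in a few lines of Python. The first reproduces the phase-change figure above (ΔS = 6008 J / 273 K, about 22 J/K); the second approximates the entropy change of simple heating by summing q/T over many small temperature steps and compares the result with the closed form Cp ln(T_final/T_initial). The heat capacity used is an illustrative constant value, roughly that of liquid water.

```python
import math

# Phase change: entropy of melting one mole of ice at its melting point.
q_fusion = 6008.0   # J per mole (enthalpy of fusion of ice)
T_melt = 273.0      # K
dS_melting = q_fusion / T_melt   # about 22 J/K

# Heating: sum q/T over many small increments from 300 K to 310 K.
Cp = 75.3           # J/(mol*K), roughly liquid water (illustrative constant)
T_initial, T_final = 300.0, 310.0
steps = 100_000
dT = (T_final - T_initial) / steps

dS_sum = 0.0
T = T_initial
for _ in range(steps):
    dS_sum += Cp * dT / (T + dT / 2)   # q = Cp*dT at the midpoint temperature
    T += dT

# The incremental sum converges to the calculus result Cp * ln(T_final/T_initial).
dS_exact = Cp * math.log(T_final / T_initial)
print(dS_melting, dS_sum, dS_exact)
```

With enough increments the brute-force sum and the logarithmic formula agree to many decimal places, which is exactly the point of the calculus shortcut.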

Introductory descriptions of entropy

Traditionally, 20th-century textbooks have introduced entropy in terms of order and disorder, so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. A more recent formulation, associated with Frank L. Lambert, describes entropy as measuring "the spontaneous dispersal of energy — at a specific temperature."

Wikimedia Foundation. 2010.
