Entropy

This article describes the physical quantity from thermodynamics. For other meanings, see Entropy (disambiguation).

Entropy (a coinage from Ancient Greek ἐντροπία entropía, from ἐν en 'in', 'on' and τροπή tropḗ 'turn') is a fundamental thermodynamic state variable with the SI unit joule per kelvin (J/K).

All processes that occur spontaneously within a system increase its entropy, as does the addition of heat or matter. Examples of such processes are mixing, heat conduction, chemical reactions, and the conversion of mechanical into thermal energy by friction (see dissipation, energy devaluation). The entropy of a system can decrease only through the release of heat or matter. Therefore, in an isolated system (a system that exchanges neither energy nor matter with its environment), entropy cannot decrease but can only increase over time (second law of thermodynamics). Processes that increase the entropy of an isolated system cannot run in the reverse temporal direction without external intervention; they are called irreversible. To bring a system back to its initial state after an irreversible process, it must be coupled to its environment, which absorbs the increase in entropy and thereby changes its own state as well.

For example, consider a system consisting of a cold and a hot body in an insulating box, i.e. a practically isolated system: heat transport sets in and the temperature difference disappears. After a certain time both bodies have the same temperature, at which point the system has reached the state of greatest entropy. In such an isolated system we practically never observe the colder body spontaneously cooling further while the warmer one heats up.

In statistical mechanics, a macrostate of a system, which is defined exclusively by macroscopic thermodynamic quantities, is the more probable the greater the number of microstates that can realize it and that can be converted into one another by internal processes. This number therefore determines the entropy of the system in this macrostate. In a system left to itself in an arbitrary initial state, the spontaneously occurring internal processes then drive the state of the system, with the greatest probability, towards the macrostate that, at the same energy, can be realized by the greatest number of different microstates, i.e. that has the highest possible entropy.

This is often colloquially described by saying that entropy is a "measure of disorder". However, disorder is not a well-defined physical concept and therefore has no physical measure. It is more correct to think of entropy as an objective measure of the amount of information that would be required to infer the actual microstate of the system from an observable macrostate. This is what is meant when entropy is also paraphrased as a "measure of the ignorance of the states of all individual particles".

Video: When ice melts, the ordered ice crystal structure is transformed into the disordered motion of individual water molecules: the entropy of the water in the ice cube increases in the process (Rudolf Clausius, 1862).

Historical overview

In the history of physics there was for a long time a dispute about the meaning of the concept of heat: one side held the theory that the phenomena of heat were solely due to the vis viva ("living force" = kinetic energy) of the atoms; the other claimed that heat was a substance, which was given the name caloricum (French calorique, English caloric).

Antoine Laurent de Lavoisier distinguished chaleur (heat) from calorique (caloricum) in 1789. Among other things, the caloricum was supposed to cause a repulsive force between the atoms of a solid, so that if a sufficient amount of caloricum was supplied, the solid would first become liquid and then gaseous. Together with Pierre Simon Laplace, he constructed an ice calorimeter. Lavoisier and Laplace, however, did not wish to determine whether the vis viva or the caloricum substance was the cause of the thermal phenomena. Joseph Black distinguished temperature from quantity of heat, partly on the basis of latent heat in melting. He remarked that the quantity of heat must be carried along with the steam escaping from a boiler.

Benjamin Thompson, Count Rumford, investigated during his time in Munich in 1798 the temperature of the chips produced in the boring of cannon barrels. Because an arbitrarily large amount of heat could be generated from the mechanical boring work, he doubted that the caloricum could be a (conserved) substance, thus giving a boost to the proponents of the vis viva theory.

The eponym of the Carnot process, Nicolas Léonard Sadi Carnot, wrote in 1824 that the power of a steam engine is due not to the consumption of calorique but to its transport from a warm body to a cold one, thereby preparing the way for the concept of entropy. The experiments of Robert Mayer and James Prescott Joule in the early 1840s showed that mechanical work can be converted quantitatively into heat. This was the basis for the general formulation of the law of conservation of energy, i.e. the first law, by Hermann von Helmholtz in 1847. Since then, the physical term heat has been fixed to its energetic meaning.

Another twenty years later, however, Rudolf Clausius found that when the energy form heat is transferred, a second, quantity-like (extensive) quantity must also flow. He identified this quantity as the cause of disgregation during melting and called it entropy. As worked out by Wilhelm Ostwald in 1908 and Hugh Longbourne Callendar in 1911, Clausius' entropy corresponds to the calorique of Lavoisier and Carnot.

In 1875, Ludwig Boltzmann and Willard Gibbs succeeded in giving entropy a statistical definition which explains the previously macroscopically defined quantity in microscopic terms. The entropy S of a macrostate is calculated from the probabilities p_i of the microstates i:

S = -k_{\mathrm{B}} \sum_{i} p_{i} \ln(p_{i})

The proportionality factor k_{\mathrm{B}} is the Boltzmann constant; Boltzmann himself, however, did not determine its value.

Entropy defined statistically in this way can be usefully applied in many contexts.
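As an illustration of the statistical definition, the following sketch (in Python, with made-up probability distributions) evaluates S = -k_B Σ_i p_i ln p_i numerically and checks that a uniform distribution over W microstates reproduces Boltzmann's S = k_B ln W, while a less uniform distribution over the same microstates has lower entropy.

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def gibbs_entropy(probabilities):
    """Statistical entropy S = -k_B * sum_i p_i * ln(p_i)."""
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]                      # terms with p_i = 0 contribute nothing
    return -K_B * np.sum(p * np.log(p))

# A uniform distribution over W microstates reproduces S = k_B ln W
W = 1_000_000
uniform = np.full(W, 1.0 / W)
print(gibbs_entropy(uniform), K_B * np.log(W))   # both ≈ 1.907e-22 J/K

# A non-uniform distribution over the same W microstates has lower entropy
biased = np.full(W, 0.5 / (W - 1))
biased[0] = 0.5
print(gibbs_entropy(biased) < gibbs_entropy(uniform))  # True
```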

Connections between entropy and information emerged as early as the 19th century through the discussion of Maxwell's demon, a thought experiment that became topical again in the context of miniaturization in the computer age. Computer science uses Shannon's information entropy, which corresponds to the statistical interpretation, as an abstract measure of information without direct reference to its physical realization. Norbert Wiener also used the concept of entropy to describe information phenomena, but with the opposite sign. That Shannon's convention has prevailed is mainly due to the better technical usability of his work.

Classical thermodynamics

In thermodynamics, a system can exchange energy with its environment in two ways: as heat or as work, whereby different kinds of work exist depending on the system and on how the process is conducted, e.g. volume work and magnetic work. In the course of such an energy exchange, the entropy of both the system and the environment may change. The change occurs spontaneously only if the sum of all entropy changes is positive.

Basics

The entropy S (unit J/K) is an extensive state variable of a physical system and, like the volume, the electric charge or the amount of substance, behaves additively when several systems are combined. The physicist Rudolf Clausius introduced the term in 1865 to describe cyclic processes. Dividing S by the mass of the system yields the specific entropy s with the unit J/(kg·K) as an intensive state variable.

According to Clausius, for reversible processes between systems in equilibrium, the differential \mathrm{d}S is the ratio of the transferred heat \delta Q_{\mathrm{rev}} to the absolute temperature T:

\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T} \qquad (1)

This entropy change is positive when heat is supplied and negative when heat is removed. In this notation, an upright \mathrm{d} is used in \mathrm{d}S to emphasize that it is a complete (exact) differential, unlike \delta Q, which cannot be a complete differential because Q is a process variable. In this context the reciprocal absolute temperature plays the role of an "integrating factor" that turns the reversibly supplied or removed heat, mathematically speaking an incomplete differential, into the associated complete differential \mathrm{d}S. As a result, the change in entropy for reversible processes is, unlike the heat supplied or removed, path-independent. With the definition of an arbitrary value for a reference state, entropy thus becomes a state variable determined solely by the respective state.

In this respect, for reversibly conducted processes, entropy can also be defined as the "heat energy weighted by \tfrac{1}{T}". The problem of how far the energy of a system can be converted into work is treated further below.
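As a simple worked example of equation (1), consider reversibly melting one kilogram of ice at 0 °C; the sketch below assumes the commonly quoted latent heat of fusion of about 334 kJ/kg, a value not stated in this article.

```python
# Worked example for Eq. (1): reversibly melting 1 kg of ice at 0 °C.
# Because the temperature stays constant during melting, dS = δQ_rev/T
# integrates to ΔS = Q_rev / T.

m = 1.0            # mass of ice in kg
L_f = 334e3        # latent heat of fusion of water in J/kg (approximate literature value)
T = 273.15         # melting temperature in K

Q_rev = m * L_f            # reversibly transferred heat in J
delta_S = Q_rev / T        # entropy change of the water in J/K
print(f"ΔS ≈ {delta_S:.0f} J/K")   # ≈ 1223 J/K
```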

If one uses the first law of thermodynamics, \mathrm{d}U = \delta W + \delta Q, so that the energy change \mathrm{d}U is composed of the work and the heat supplied, and inserts for the work \delta W = -p\,\mathrm{d}V + \mu\,\mathrm{d}N + \dots all the processes the experimenter can realize by changing the system variables, one obtains from (1) for the change of entropy as a function of the thermodynamic variables (still in the reversible case)

\mathrm{d}S = \frac{1}{T}\,(\mathrm{d}U + p\,\mathrm{d}V - \mu\,\mathrm{d}N - \dots)

Clausius also treated irreversible processes and showed that in an isolated thermodynamic system the entropy can never decrease:

\Delta S \geq 0 \qquad (2),

where the equality applies only to reversible processes. \Delta S = S_e - S_a is the entropy change of the system, with S_a the entropy of the state at the beginning of the change of state and S_e that of the state at the end of the process.

From (2), for closed systems where thermal energy can pass through the system boundaries, the inequality follows:

\Delta S \geq \Delta S_Q = \int \frac{\delta Q}{T} \qquad (3a)

\Delta S_Q is the entropy fraction resulting from the supply of heat across the system boundary. The formula also applies to the removal of heat from the system, in which case \Delta S_Q is negative. Inequality (3a) becomes an equation only for purely reversible processes.

When analyzing thermodynamic systems in engineering, one often performs a balance analysis. To do this, one writes the inequality (3a) in the following form:

\Delta S = \Delta S_Q + \Delta S_{\mathrm{irr}} \qquad (3)

Here \Delta S_{\mathrm{irr}} \geq 0 is the entropy fraction that arises from irreversible processes inside the system. These include, for example, mixing processes after the removal of an internal partition, thermal equalization processes, the conversion of electrical or mechanical energy (ohmic resistance, stirring) into heat, and chemical reactions. If the irreversible processes are restricted exclusively to the dissipation of mechanical or electrical work \delta W_{\mathrm{diss}}, then \Delta S_{\mathrm{irr}} can be expressed by the work or by the dissipated power P_{\mathrm{diss}}:

\Delta S_{\mathrm{irr}} = \int \frac{\delta W_{\mathrm{diss}}}{T} = \int \frac{P_{\mathrm{diss}}}{T}\,\mathrm{d}t

If the irreversible process runs quasistatically, so that the system is always close to an equilibrium state, then (3) can also be written with time derivatives:

\dot{S} = \dot{S}_Q + \dot{S}_{\mathrm{irr}}

Here \dot{S}_Q is called the entropy transport rate and \dot{S}_{\mathrm{irr}} the entropy production rate.
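A minimal numerical sketch of the balance (3), applied to the heat-conduction example from the introduction; the heat quantity and temperatures are made-up illustrative values, and both bodies are assumed large enough that their temperatures remain practically constant.

```python
# Entropy balance (3) for the introductory example: a quantity of heat Q flows
# from a hot body at T_h to a cold body at T_l inside an insulating box.
# No heat crosses the outer system boundary, so ΔS_Q = 0 and the entire change
# is entropy production ΔS_irr.

Q   = 1000.0   # transferred heat in J (illustrative value)
T_h = 350.0    # temperature of the hot body in K (illustrative)
T_l = 300.0    # temperature of the cold body in K (illustrative)

dS_hot  = -Q / T_h          # the hot body loses entropy
dS_cold = +Q / T_l          # the cold body gains more entropy than the hot one loses
dS_irr  = dS_hot + dS_cold  # entropy produced by the irreversible heat conduction

print(f"ΔS_irr = {dS_irr:.3f} J/K > 0")   # ≈ 0.476 J/K
```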

From the first law of thermodynamics

\Delta U=W+Q

it follows that the product T\,\Delta S (equal to the reversibly exchanged heat Q) represents the unused part ("waste heat") in the isothermal generation of work W from the available internal energy \Delta U. The maximum value of this work is the so-called free energy

\Delta F = \Delta U - T \Delta S.

This is an equivalent formulation of the second law.
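The following sketch illustrates this bookkeeping with made-up numbers for an isothermal process in which the system's entropy decreases, so that the heat T·ΔS has to be rejected and only the free-energy change remains available as work.

```python
# Isothermal work extraction: of an internal-energy decrease |ΔU|, at most
# |ΔF| = |ΔU - T·ΔS| can be obtained as work; the rest, |T·ΔS|, leaves as heat.
# The numbers below are made-up illustrative values.

T  = 300.0     # constant temperature in K
dU = -1000.0   # change of internal energy in J
dS = -2.0      # entropy change of the system in J/K

dF         = dU - T * dS     # change of free energy: -400 J
max_work   = -dF             # maximum work obtainable from the system: 400 J
waste_heat = -T * dS         # heat that must be rejected reversibly: 600 J

# Energy bookkeeping (first law ΔU = W + Q, with W and Q counted into the system):
assert abs(dU - (-max_work + (-waste_heat))) < 1e-9
print(max_work, waste_heat)
```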

One consequence of this is the impossibility of a perpetual motion machine of the 2nd kind. Clausius formulated:

"No cycle exists whose sole effect is to transfer heat from a colder reservoir to a warmer reservoir."

Otherwise one would obviously have constructed an inexhaustible source of energy. If it were possible to construct such a cyclic process, one could continuously take energy from the warm reservoir and perform work with it. The dissipated work would then be supplied to the cold reservoir and would, via the mentioned cyclic process, benefit the warm reservoir again. Equivalent to this is the formulation of William Thomson, the later Lord Kelvin:

"No cycle exists that takes a quantity of heat from a reservoir and turns it entirely into work."

An ideal process that can be reversed at any time without friction losses is called reversible. Often the entropy remains unchanged during a process, \Delta S = 0; a well-known example is the adiabatic compression and expansion in the cycle of a Carnot engine. Changes of state with constant entropy are also called isentropic, but not every isentropic change of state is adiabatic. If a process is adiabatic and reversible, however, it is always isentropic as well.

If, in a cycle, the heat Q_{\mathrm{h}} is absorbed at the temperature T_{\mathrm{h}} and the amount of heat Q_{\mathrm{l}} is released again at T_{\mathrm{l}}, and if the absorption and release of heat are reversible, then the entropy does not change:

\oint \mathrm{d}S = 0; or \frac{Q_{\mathrm{h}}}{T_{\mathrm{h}}} = \frac{Q_{\mathrm{l}}}{T_{\mathrm{l}}}.

From this, the maximum work done W = Q_{\mathrm{h}} - Q_{\mathrm{l}} and the maximum efficiency \eta, the so-called Carnot efficiency, can be derived:

\eta = \frac{W}{Q_{\mathrm{h}}} = \frac{T_{\mathrm{h}} - T_{\mathrm{l}}}{T_{\mathrm{h}}}.

The Carnot efficiency represents the maximum possible work yield for all heat engines. Real machines usually have a considerably lower efficiency. In them, part of the theoretically available work is dissipated, e.g. by friction. Consequently, entropy is produced in a real machine and more heat is rejected to the cold reservoir than necessary. It therefore operates irreversibly.
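A short numerical sketch of the Carnot efficiency and of the extra entropy produced by a less efficient real engine; the reservoir temperatures, heat input, and the assumed real efficiency are illustrative values, not taken from the article.

```python
# Carnot efficiency for given reservoir temperatures, compared with a real
# (irreversible) engine. All numbers are illustrative.

T_h = 600.0     # hot reservoir temperature in K
T_l = 300.0     # cold reservoir temperature in K
Q_h = 1000.0    # heat taken from the hot reservoir per cycle in J

eta_carnot = (T_h - T_l) / T_h          # maximum (Carnot) efficiency: 0.5
W_max      = eta_carnot * Q_h           # maximum work per cycle: 500 J

# A real engine delivers less work; the difference shows up as extra heat in
# the cold reservoir and as produced entropy.
eta_real = 0.35                          # assumed efficiency of a real engine
W_real   = eta_real * Q_h                # 350 J
Q_l_real = Q_h - W_real                  # 650 J rejected to the cold reservoir

dS_per_cycle = Q_l_real / T_l - Q_h / T_h   # entropy produced per cycle
print(eta_carnot, W_real, dS_per_cycle)     # 0.5, 350 J, ≈ 0.5 J/K
```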

The third law (the so-called "Nernst heat theorem") defines the entropy of a perfectly crystalline substance, in which, for example, no spin degeneracy occurs, as zero at absolute zero:

S(T=0) \equiv 0\,.

One consequence is, for example, that the heat capacity of a system vanishes at low temperatures and, above all, that absolute zero temperature cannot be reached (this also holds in the presence of spin degeneracy).

If a substance does not fulfil the condition of being perfectly crystalline (e.g. if there are several configurations or if it is a glass), an entropy can also be attributed to it at absolute zero (zero point entropy).
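As a numerical illustration of such a zero-point entropy (an example not taken from the article): for a solid in which each molecule can freeze into one of two energetically equivalent orientations, the statistical definition gives a residual molar entropy of R ln 2; carbon monoxide is the textbook case, with a measured residual entropy of roughly this magnitude.

```python
import math

# Residual ("zero-point") molar entropy of a solid in which every molecule can
# be frozen into one of two energetically equivalent orientations (the textbook
# example is carbon monoxide). Each of the W = 2^N_A arrangements is equally
# probable, so S_0 = k_B * ln(2^N_A) = R * ln 2 per mole.

K_B = 1.380649e-23      # Boltzmann constant in J/K
N_A = 6.02214076e23     # Avogadro constant in 1/mol
R   = K_B * N_A         # molar gas constant in J/(mol·K)

S_residual = R * math.log(2)
print(f"S_0 ≈ {S_residual:.2f} J/(mol·K)")   # ≈ 5.76 J/(mol·K)
```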

Partial derivatives of entropy

From the second law follow statements about the partial derivatives of entropy, e.g. with respect to the temperature T or the volume V. By the second law, for reversible changes of state \mathrm{d}S = \tfrac{\delta Q_{\mathrm{rev}}}{T}. Together with the first law it follows that \mathrm{d}S = \tfrac{\mathrm{d}U - \delta W}{T}, because according to the first law the sum of the work \delta W supplied to the system under consideration and the heat \delta Q supplied (neither of which is a state function by itself) yields a state function, namely the internal energy U of the system. It is assumed here that the changes of volume and temperature occur quasistatically (slowly), so that no irreversible processes are generated.

So

\mathrm{d}S = \frac{1}{T}\left(\frac{\partial U(T,V)}{\partial V} + p\right)\mathrm{d}V + \frac{1}{T}\,\frac{\partial U(T,V)}{\partial T}\,\mathrm{d}T,

where \delta W = -p\,\mathrm{d}V was used. This yields

\frac{\partial S}{\partial V} = \frac{1}{T}\left(\frac{\partial U(T,V)}{\partial V} + p\right) \quad \text{and} \quad \frac{\partial S}{\partial T} = \frac{1}{T}\,\frac{\partial U(T,V)}{\partial T}.

Similar relationships arise when the system depends on other variables besides density or volume, such as electrical or magnetic moments.

It follows from the third law that both \tfrac{\partial S}{\partial T} and \tfrac{\partial S}{\partial V} must vanish for T \to 0, and sufficiently rapidly, which (as can be shown) is satisfied only if quantum physics, rather than classical physics, governs the behaviour at low temperatures.
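As a symbolic cross-check of these partial derivatives (a sketch assuming a monatomic ideal gas with U = (3/2) nRT and p = nRT/V, a model not used in the article), the following Python/SymPy snippet evaluates ∂S/∂V and ∂S/∂T and verifies that the mixed second derivatives agree, i.e. that dS is indeed an exact differential.

```python
import sympy as sp

# Symbolic check of the partial derivatives of S for a monatomic ideal gas
# (an assumed model): U = (3/2) n R T and p = n R T / V, so that ∂U/∂V = 0 and
# the relations above reduce to
#   ∂S/∂V = p/T = nR/V   and   ∂S/∂T = (1/T) ∂U/∂T = (3/2) nR/T.

T, V, n, R = sp.symbols('T V n R', positive=True)

U = sp.Rational(3, 2) * n * R * T      # internal energy of the model gas
p = n * R * T / V                      # thermal equation of state

dS_dV = (sp.diff(U, V) + p) / T        # (1/T)(∂U/∂V + p)
dS_dT = sp.diff(U, T) / T              # (1/T)(∂U/∂T)

print(sp.simplify(dS_dV))              # n*R/V
print(sp.simplify(dS_dT))              # 3*n*R/(2*T)

# The mixed second derivatives agree, so dS is an exact differential:
print(sp.simplify(sp.diff(dS_dV, T) - sp.diff(dS_dT, V)))   # 0
```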

Questions and Answers

Q: What is the entropy of an object?



A: The entropy of an object is a measure of the amount of energy which is unavailable to do work, and also a measure of the number of possible arrangements the atoms in a system can have.

Q: What is the relationship between entropy and uncertainty/randomness?



A: Entropy is a measure of uncertainty or randomness, as the higher the entropy of an object, the more uncertain we are about the states of the atoms making up that object because there are more states to decide from.

Q: Can the entropy of an object or system be made smaller without work?



A: For an isolated system, no: by the second law of thermodynamics its entropy can never decrease. The entropy of a non-isolated system can only be lowered by doing work on it or by letting it release heat or matter, which passes the entropy on to its surroundings; left to itself, everything slowly tends towards disorder, which means higher entropy.

Q: Where did the word entropy come from?



A: The word entropy came from the study of heat and energy between 1850 and 1900, and this work produced some very useful mathematical ideas about probability calculations which are now used in information theory, statistical mechanics, chemistry, and other fields of study.

Q: What does entropy quantitatively measure?



A: Entropy simply measures what the second law of thermodynamics describes: the spreading of energy until it is evenly spread.

Q: How does the meaning of entropy differ in different fields?



A: The meaning of entropy varies in different fields, and it can mean different things, such as information content, disorder, and energy dispersal.

Q: What is the role of entropy in probability calculations?



A: Entropy provides a mathematical way to quantify the degree of disorder or uncertainty in a system, which is useful in probability calculations.
