Entropy
This article describes the physical term from thermodynamics. For other meanings, see Entropy (disambiguation).
Entropy (a coined word from Ancient Greek ἐντροπία entropía, from ἐν en 'at', 'in' and τροπή tropḗ 'turn') is a fundamental thermodynamic state variable with the SI unit joule per kelvin (J/K).
All processes that occur spontaneously within a system cause an increase in its entropy, as does the addition of heat or matter. Such processes include mixing, heat conduction, chemical reactions, and the conversion of mechanical into thermal energy by friction (see dissipation, energy devaluation). The entropy of a system can only decrease through the release of heat or matter. Therefore, in an isolated system (a system in which there is no exchange of energy or matter with the environment), entropy cannot decrease, but can only increase over time (second law of thermodynamics). Processes that increase the entropy of an isolated system cannot proceed in the reverse temporal direction without external intervention; they are called irreversible. In order to bring a system back to its initial state after an irreversible process, it must be coupled to its environment, which absorbs the increase in entropy and thereby also changes its own state.
For example, consider a system consisting of a cold and a hot body in an insulating box, i.e. a practically isolated system: heat transport sets in and the temperature difference disappears. After a certain time both bodies have the same temperature, at which point the system has reached the state of greatest entropy. In such an isolated system we practically never observe the spontaneous cooling of the colder body and heating of the warmer one.
In statistical mechanics, the macrostate of a system, which is defined exclusively by macroscopic thermodynamic quantities, is the more probable the greater the number of microstates that can realize it and that can transform into one another through internal processes. This number therefore determines the entropy of the system in this macrostate. In a system left to itself in an arbitrary initial state, the spontaneously occurring internal processes then cause the state of the system to approach, with the greatest probability, that macrostate which, at the same energy, can be realized by the greatest number of different microstates, i.e. which has the highest possible entropy.
This is often colloquially described by saying that entropy is a "measure of disorder". However, disorder is not a well-defined physical concept and therefore has no physical measure. It is more correct to think of entropy as an objective measure of the amount of information that would be required to infer the actual microstate of the system from an observable macrostate. This is what is meant when entropy is also paraphrased as a "measure of the ignorance of the states of all individual particles".
[Video] When ice melts, the ordered ice crystal structure is transformed into a disordered movement of individual water molecules: the entropy of the water in the ice cube increases in the process (Rudolf Clausius, 1862).
Historical overview
In the history of physics there was for a long time a dispute about the meaning of the concept of heat: one side held the theory that the phenomena of heat were solely due to the vis viva ("living force" = kinetic energy) of the atoms; the other claimed that heat was a substance, which was given the name caloricum (French calorique, English caloric).
Antoine Laurent de Lavoisier distinguished chaleur (heat) from calorique (caloricum) in 1789. Among other things, the caloricum was supposed to cause a repulsive force between the atoms of a solid, so that if a sufficient amount of caloricum was supplied, the solid would first become liquid and then gaseous. Together with Pierre Simon Laplace, he constructed an ice calorimeter. Lavoisier and Laplace, however, did not wish to determine whether the vis viva or the caloricum substance was the cause of the thermal phenomena. Joseph Black distinguished temperature from quantity of heat, partly on the basis of latent heat in melting. He remarked that the quantity of heat must be carried along with the steam escaping from a boiler.
Benjamin Thompson, Count Rumford, investigated in his Munich days in 1798 the temperature of the chips produced in the boring of cannon barrels. Given the arbitrarily large amount of heat that could be generated from the mechanical boring work, he doubted that the caloricum could be a (conserved) substance, thus giving a boost to the proponents of the vis viva theory.
The eponym of the Carnot process, Nicolas Léonard Sadi Carnot, wrote in 1824 that the power of a steam engine is not due to the consumption of calorique, but to its transport from a warm body to a cold one, thus preparing the concept of entropy. The experiments of Robert Mayer and James Prescott Joule in the early 1840s demonstrated that mechanical work could be quantitatively converted into heat. This was the basis for Hermann von Helmholtz's general formulation of the law of conservation of energy in 1847, i.e. the first law. Since then, the physical term heat has been fixed to its energetic meaning.
Another 20 years later, however, Rudolf Clausius found that when the energy form heat is transferred, another quantity-like variable must also flow. He identified this quantity as the cause of the disgregation during melting and called it entropy. As worked out by Wilhelm Ostwald in 1908 and Hugh Longbourne Callendar in 1911, Clausius' entropy corresponds to Lavoisier's and Carnot's calorique.
In 1875, Ludwig Boltzmann and Willard Gibbs succeeded in giving entropy a statistical definition which explains the previously macroscopically defined quantity in microscopic terms. The entropy $S$ of a macrostate is calculated from the probabilities $p_i$ of the microstates $i$:

$S = -k_\mathrm{B} \sum_i p_i \ln p_i$

The proportionality factor $k_\mathrm{B}$ is the Boltzmann constant; Boltzmann himself, however, did not determine its value.
Entropy defined statistically in this way can be usefully applied in many contexts.
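As a numerical illustration of this statistical definition, the following Python sketch (not part of the original article; the example distributions are merely illustrative assumptions) evaluates $S = -k_\mathrm{B}\sum_i p_i \ln p_i$ and checks that $W$ equally probable microstates reproduce Boltzmann's $S = k_\mathrm{B}\ln W$:

```python
# A minimal sketch: evaluating the statistical entropy
# S = -k_B * sum_i p_i * ln(p_i) for an assumed set of microstate probabilities.
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def statistical_entropy(probabilities):
    """Return S = -k_B * sum(p * ln p) for a normalized distribution."""
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)

# W equally probable microstates reproduce Boltzmann's S = k_B * ln W:
W = 4
p_uniform = [1.0 / W] * W
print(statistical_entropy(p_uniform), K_B * math.log(W))  # identical values

# A sharper (less uniform) distribution has lower entropy:
print(statistical_entropy([0.7, 0.1, 0.1, 0.1]))
```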
Connections between entropy and information emerged as early as the 19th century in the discussion of Maxwell's demon, a thought experiment that became topical again in the context of miniaturization in the computer age. Computer science uses Shannon's information entropy, which corresponds to the statistical interpretation, as an abstract measure of information without any direct reference to physical realization. Norbert Wiener also used the concept of entropy to describe information phenomena, but with the opposite sign. The fact that Shannon's convention has prevailed is mainly due to the better technical usability of his work.
Classical thermodynamics
In thermodynamics, a system can exchange energy with its environment in two ways: as heat or as work, where different variants of work exist depending on the system and the process control, including volume work and magnetic work. In the course of such an energy exchange, the entropy of both the system and the environment can change. The change occurs spontaneously only if the sum of all entropy changes is positive.
Basics
The entropy (unit J/K) is an extensive state variable of a physical system and, like the volume, the electric charge or the amount of substance, behaves additively when several systems are combined. The physicist Rudolf Clausius introduced the term in 1865 to describe cyclic processes. Dividing by the mass of the system yields the specific entropy with the unit J/(kg·K) as an intensive state variable.
The differential $\mathrm{d}S$ is, according to Clausius, for reversible processes between systems in equilibrium, the ratio of the transferred heat $\delta Q_\mathrm{rev}$ and the absolute temperature $T$:

$\mathrm{d}S = \frac{\delta Q_\mathrm{rev}}{T} \qquad (1)$

This entropy change is positive when heat is added and negative when heat is removed. In this notation, a non-italic $\mathrm{d}$ is used to emphasize that $\mathrm{d}S$ is a complete (exact) differential, unlike $\delta Q$, which cannot be a complete differential because heat is a process variable. The reciprocal absolute temperature $1/T$ thus plays the role of an "integrating evaluation factor" that turns the reversibly added or removed heat, mathematically an incomplete differential, into the associated complete differential $\mathrm{d}S$. This makes the change in entropy for reversible processes, unlike the heat added or removed, path-independent. With the definition of an arbitrary value for a reference state, the entropy thereby becomes a state variable determined solely by the respective state.
In this respect, entropy under reversible process control can also be defined as the "heat energy evaluated with $1/T$". The problem of how far the energy of a system can be converted into work is treated further below.
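As a concrete illustration of definition (1), the following Python sketch computes the entropy change for the reversible heating of water at constant pressure: with $\delta Q_\mathrm{rev} = m\,c\,\mathrm{d}T$, integration gives $\Delta S = m\,c\,\ln(T_2/T_1)$. The specific heat and the temperatures are assumed example values, not taken from the article:

```python
# A hedged numerical sketch of equation (1): for reversible heating at
# constant pressure with delta_Q = m * c * dT, integrating dS = delta_Q / T
# gives Delta_S = m * c * ln(T2 / T1).
import math

m = 1.0          # mass in kg
c = 4186.0       # specific heat of water in J/(kg*K), approximate
T1 = 293.15      # initial temperature in K (20 degrees C)
T2 = 353.15      # final temperature in K (80 degrees C)

delta_S = m * c * math.log(T2 / T1)
print(f"Entropy change: {delta_S:.1f} J/K")   # roughly 780 J/K
```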
Using the first law of thermodynamics, $\mathrm{d}U = \delta Q + \delta W$, so that the energy change is composed of the supplied work and heat, and substituting for the work $\delta W$ all processes that are possible for the experimenter by changing the system variables (in the simplest case the volume work $\delta W = -p\,\mathrm{d}V$), one obtains from (1) for the change of entropy as a function of the thermodynamic variables (still in the reversible case)

$\mathrm{d}S = \frac{1}{T}\left(\mathrm{d}U + p\,\mathrm{d}V\right)$
Clausius also treated irreversible processes and showed that in an isolated thermodynamic system the entropy can never decrease:

$\Delta S \ge 0 \qquad (2)$

where the equal sign applies only to reversible processes. $\Delta S = S_\mathrm{E} - S_\mathrm{A}$ is the entropy change of the system, with $S_\mathrm{A}$ the entropy of the state at the beginning of the state change and $S_\mathrm{E}$ the entropy of the state at the end of the process.
From (2), for closed systems, where thermal energy can pass through the system boundaries, the following inequality follows:

$\Delta S \ge \Delta S_Q = \int \frac{\delta Q}{T} \qquad (3a)$

$\Delta S_Q$ is the entropy fraction that results from the supply of heat across the system boundary. The formula also applies to the removal of heat from the system, in which case $\Delta S_Q$ is negative. Inequality (3a) becomes an equation only for purely reversible processes.
When analyzing thermodynamic systems in engineering, one often performs a balance analysis. To do this, the inequality (3a) is written in the following form:

$\Delta S = \Delta S_Q + \Delta S_\mathrm{irr}, \qquad \Delta S_\mathrm{irr} \ge 0 \qquad (3b)$

Here $\Delta S_\mathrm{irr}$ is the entropy fraction that arises from irreversible processes inside the system. These include, for example, mixing processes after the removal of an internal partition, thermal equalization processes, the conversion of electrical or mechanical energy (ohmic resistance, stirring) into heat, and chemical reactions. If the irreversible processes are restricted exclusively to the dissipation of mechanical or electrical work $\delta W_\mathrm{diss}$, then $\Delta S_\mathrm{irr}$ can be expressed by the work or the dissipated power $P_\mathrm{diss}$:

$\Delta S_\mathrm{irr} = \int \frac{\delta W_\mathrm{diss}}{T} = \int \frac{P_\mathrm{diss}}{T}\,\mathrm{d}t$
If the irreversible process runs quasistatically, so that the system is always close to an equilibrium state, then (3b) can also be written with time derivatives:

$\frac{\mathrm{d}S}{\mathrm{d}t} = \dot{S}_Q(t) + \dot{S}_\mathrm{irr}(t)$

Here, $\dot{S}_Q$ is called the entropy transport stream and $\dot{S}_\mathrm{irr}$ the entropy production stream.
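A minimal numerical sketch of such a balance (the values are illustrative assumptions) is the irreversible conduction of a heat quantity $Q$ from a hot to a cold reservoir; each reservoir exchanges its heat at constant temperature, and the entropy produced by the transfer is $Q/T_\mathrm{cold} - Q/T_\mathrm{hot} > 0$:

```python
# Entropy production in the classic irreversible process of heat conduction:
# a heat quantity Q flows from a hot reservoir at T_hot to a cold reservoir
# at T_cold.  Each reservoir exchanges heat at its own constant temperature,
# so the net entropy change is produced entirely by the irreversible transfer.
Q = 1000.0      # transferred heat in J
T_hot = 400.0   # K
T_cold = 300.0  # K

dS_hot = -Q / T_hot           # entropy given up by the hot reservoir
dS_cold = Q / T_cold          # entropy received by the cold reservoir
dS_irr = dS_hot + dS_cold     # entropy produced inside the composite system

print(f"Delta S_irr = {dS_irr:.3f} J/K (> 0, as the second law requires)")
```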
From the first law of thermodynamics

$\Delta U = Q + W$

it follows that the product $T\,\Delta S$ represents the non-utilized part ("waste heat") in the isothermal generation of work $W$ from the available internal energy $\Delta U$. The maximum value of this work is the so-called free energy

$\Delta F = \Delta U - T\,\Delta S$.

This is an equivalent form of the second law.
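To make the connection explicit, the following short derivation is a sketch under the usual sign convention (heat $Q$ and work $W$ counted positive when supplied to the system); it is a reconstruction, not a quotation from the article. For an isothermal process at temperature $T$, the Clausius inequality $Q \le T\,\Delta S$ bounds the work $W_\text{out} = -W$ that the system can deliver:

```latex
% Isothermal process at constant T; Q, W positive when supplied to the
% system, W_out = -W is the work delivered by the system.
\begin{aligned}
  \Delta U &= Q + W,\\
  W_\text{out} &= -W = Q - \Delta U \;\le\; T\,\Delta S - \Delta U
               \;=\; -(\Delta U - T\,\Delta S) \;=\; -\Delta F .
\end{aligned}
```

Equality holds for reversible process control, so $-\Delta F$ is the maximum work obtainable isothermally, and $T\,\Delta S$ is the part of $\Delta U$ that is not available as work.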
One consequence of this is the impossibility of a perpetual motion machine of the 2nd kind. Clausius formulated:
"No cycle exists whose sole effect is to transfer heat from a colder reservoir to a warmer reservoir."
Otherwise one would obviously have constructed an inexhaustible source of energy: if it were possible to construct such a cyclic process, one could continuously take energy from the warm reservoir and perform work with it. The dissipated work would then be supplied to the cold reservoir and, via the mentioned cyclic process, would again benefit the warm reservoir. Equivalent to this is the formulation of William Thomson, the later Lord Kelvin:
"No cycle exists that takes a quantity of heat from a reservoir and turns it entirely into work."
An ideal process that can be reversed at any time without friction losses is also called reversible. Often the entropy remains unchanged during a process, $\Delta S = 0$; a well-known example is the adiabatic compression and expansion in the cycle of a Carnot machine. Changes of state with constant entropy are also called isentropic, but not every isentropic change of state is adiabatic. If a process is adiabatic and reversible, however, it always follows that it is also isentropic.
If in a cycle the heat $Q_\mathrm{h}$ is absorbed at the temperature $T_\mathrm{h}$ and the amount of heat $Q_\mathrm{c}$ is released again at $T_\mathrm{c}$, and if the heat absorption and release are reversible, then the entropy does not change:

$\Delta S = \frac{Q_\mathrm{h}}{T_\mathrm{h}} - \frac{Q_\mathrm{c}}{T_\mathrm{c}} = 0$, or $\frac{Q_\mathrm{h}}{T_\mathrm{h}} = \frac{Q_\mathrm{c}}{T_\mathrm{c}}$.

From this, the maximum work performed $W = Q_\mathrm{h} - Q_\mathrm{c}$ and the maximum efficiency $\eta$, the so-called Carnot efficiency, can be derived:

$\eta_\mathrm{C} = \frac{W}{Q_\mathrm{h}} = 1 - \frac{T_\mathrm{c}}{T_\mathrm{h}}$
The Carnot efficiency represents the maximum possible work yield of any heat engine. Real machines usually have a considerably lower efficiency. In them, part of the theoretically available work is dissipated, e.g. by friction. Consequently, entropy is produced in a real machine, and more heat is released to the cold reservoir than is necessary. It therefore operates irreversibly.
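For orientation, the following Python sketch evaluates the Carnot bound for assumed example temperatures (they are not values from the article):

```python
# A small sketch of the Carnot bound: eta_C = 1 - T_cold / T_hot limits the
# work any heat engine can extract from the heat Q_hot absorbed at T_hot.
T_hot = 773.15    # K, e.g. 500 degrees C steam
T_cold = 293.15   # K, e.g. 20 degrees C cooling water

eta_carnot = 1.0 - T_cold / T_hot
Q_hot = 1.0e6     # absorbed heat in J
W_max = eta_carnot * Q_hot

print(f"Carnot efficiency: {eta_carnot:.1%}")            # about 62 %
print(f"Maximum work from 1 MJ of heat: {W_max/1e3:.0f} kJ")
```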
The third law (the so-called "Nernst heat theorem") defines the entropy of a perfectly crystalline substance, in which, for example, no spin degeneracy occurs, as zero at absolute zero:

$S(T = 0) = 0$
One consequence is, for example, that the heat capacity of a system vanishes at low temperatures and, above all, that the absolute zero of temperature cannot be reached (this also holds in the case of spin degeneracy).
If a substance does not fulfil the condition of being perfectly crystalline (e.g. if there are several configurations or if it is a glass), an entropy can also be attributed to it at absolute zero (zero point entropy).
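As a hedged illustration of such a zero-point entropy, consider a hypothetical solid in which each molecule can freeze into one of two energetically equivalent orientations; per mole this leaves a residual entropy of $R\ln 2$ at absolute zero:

```python
# Zero-point entropy of an assumed frozen-in disordered solid: if each of N
# molecules can sit in one of two equivalent orientations, there are 2**N
# microstates at T = 0, i.e. a residual molar entropy of R * ln 2.
import math

R = 8.314462618  # molar gas constant in J/(mol*K)
residual_molar_entropy = R * math.log(2)
print(f"S_0 = {residual_molar_entropy:.2f} J/(mol*K)")  # about 5.76 J/(mol*K)
```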
Partial derivatives of entropy
From the second law follow statements about the partial derivatives of the entropy, e.g. with respect to the temperature $T$ or the volume $V$. By the second law, it first holds for reversible changes of state that $\mathrm{d}S = \frac{\delta Q}{T}$. Together with the first law it follows that $\mathrm{d}S = \frac{1}{T}\left(\mathrm{d}U + p\,\mathrm{d}V\right)$, because according to the first law the sum of the work $\delta W$ supplied to the system under consideration and the supplied heat $\delta Q$ (individually not state functions!) yields a state function, the "internal energy" $U$ of the system. It was assumed here that the changes of volume and temperature occur adiabatically slowly, so that no irreversible processes are generated.
So

$\left(\frac{\partial S}{\partial T}\right)_V = \frac{1}{T}\left(\frac{\partial U}{\partial T}\right)_V = \frac{C_V}{T}$,

where $\delta W = -p\,\mathrm{d}V$ was used, and correspondingly

$\left(\frac{\partial S}{\partial V}\right)_T = \frac{1}{T}\left[\left(\frac{\partial U}{\partial V}\right)_T + p\right]$.
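As an illustrative application (an assumption-based sketch, not part of the original text), the second relation applied to the ideal gas, where $(\partial U/\partial V)_T = 0$ and $p = nRT/V$, gives $(\partial S/\partial V)_T = nR/V$ and hence $\Delta S = nR\ln(V_2/V_1)$ for an isothermal volume change:

```python
# Applying (dS/dV)_T = [(dU/dV)_T + p] / T to the ideal gas:
# (dU/dV)_T = 0 and p = n*R*T/V, so (dS/dV)_T = n*R/V and an isothermal
# expansion from V1 to V2 gives Delta_S = n*R*ln(V2/V1).
import math

R = 8.314462618        # J/(mol*K)
n = 1.0                # mol (assumed)
V1, V2 = 0.010, 0.020  # m^3, doubling the volume (assumed)

delta_S = n * R * math.log(V2 / V1)
print(f"Delta S = {delta_S:.2f} J/K")   # about 5.76 J/K for a doubling
```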
Similar relationships arise when the system depends on other variables besides density or volume, such as electrical or magnetic moments.
It follows from the third law that both $\left(\frac{\partial S}{\partial T}\right)_V$ and $\left(\frac{\partial S}{\partial V}\right)_T$ must vanish for $T \to 0$, and indeed sufficiently rapidly, which (as can be shown) is satisfied only if quantum physics, rather than classical physics, applies at low temperatures.
Questions and Answers
Q: What is the entropy of an object?
A: The entropy of an object is a measure of the amount of energy which is unavailable to do work, and also a measure of the number of possible arrangements the atoms in a system can have.
Q: What is the relationship between entropy and uncertainty/randomness?
A: Entropy is a measure of uncertainty or randomness: the higher the entropy of an object, the more uncertain we are about the states of the atoms making up that object, because there are more possible states to choose from.
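One way to picture this counting of states is the following small Python sketch (an illustration with assumed numbers, not part of the original Q&A): the number of arrangements of a simple two-state system grows enormously with the number of particles, and with it the logarithm ln W that enters the entropy:

```python
# Counting arrangements: the number of ways W to pick which n of N two-state
# particles are "up" grows sharply with N, and ln W (the statistical entropy
# up to the factor k_B) grows with it.
from math import comb, log

N = 100            # number of two-state particles (assumed)
n = 50             # half of them in the "up" state
W = comb(N, n)     # number of microstates realizing this macrostate
print(W, log(W))   # about 1.0e29 microstates, ln W about 66.8
```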
Q: Can the entropy of an object or system be made smaller without work?
A: No, a law of physics says that it takes work to make the entropy of an object or system smaller; without work, entropy can never become smaller – everything slowly goes to disorder, which means higher entropy.
Q: Where did the word entropy come from?
A: The word entropy came from the study of heat and energy between 1850 and 1900, and it produced some very useful mathematical ideas about probability calculations which are now used in information theory, statistical mechanics, chemistry, and other areas of study.
Q: What does entropy quantitatively measure?
A: Entropy simply measures what the second law of thermodynamics describes: the spreading of energy until it is evenly spread.
Q: How does the meaning of entropy differ in different fields?
A: The meaning of entropy varies in different fields, and it can mean different things, such as information content, disorder, and energy dispersal.
Q: What is the role of entropy in probability calculations?
A: Entropy provides a mathematical way to quantify the degree of disorder or uncertainty in a system, which is useful in probability calculations.