Entropy
Entropy is a fundamental concept in thermodynamics, first introduced by Rudolf Clausius in the 1850s. It quantifies the amount of energy in a system that is unavailable for doing work and reflects the degree of disorder or randomness within that system. The second law of thermodynamics, which Clausius formulated, states that as energy is transformed from one form to another, the available energy decreases, leading to an increase in entropy. This principle illustrates that energy tends to flow from areas of high concentration to low concentration, resulting in a natural progression towards disorder in closed systems.
Entropy can also be understood in the context of time; the increase in entropy can be seen as an indicator of time's passage. For instance, as coal burns to produce electricity, some energy is lost as heat, a lower-quality energy state with higher entropy. Importantly, entropy does not increase at a fixed rate; specific events, such as natural disasters or human actions, can speed or slow its growth. The measurement of entropy is essential in various fields, including physics, chemistry, and biology, as it governs the behavior of energy within systems and influences processes ranging from the functioning of engines to biological evolution.
Summary: Entropy is an indicator of the amount of energy in a system that is no longer available for additional work; it measures the dispersion of heat in a thermodynamic system.
Entropy is a key concept in thermodynamics. The term was coined by Rudolf Clausius in the 1850s, who used it in formulating the second law of thermodynamics (the entropy law). Entropy is an indicator of the amount of energy that is no longer available for additional work. It measures the dispersion of heat in a thermodynamic system, which is often also taken as a measure of the amount of disorder, or randomness, in the system. Although the first law of thermodynamics is time-invariant (the total amount of energy is constant), the second law gives time a direction through the change in entropy. Entropy, then, is essential for understanding energy flow throughout the universe (both the universe of all known mass and space and any specific “universe,” that is, a system and its surroundings) and is therefore fundamental to numerous chemical, biological, mechanical, and physical processes.
The second law of thermodynamics, formulated by Clausius in the 1850s, governs the transformation of energy among its various forms. The law states that as energy is used in work, the amount of available, or free, energy decreases. The energy does not disappear; it is conserved, as the first law of thermodynamics stipulates. Instead, the energy becomes degraded and can no longer perform as much work as it could in its original, energy-concentrated state. Clausius used the term entropy for the measure of the degradation, or disorder, that occurs. As free (available) energy is transformed, the system moves from order toward disorder; for energy, this manifests itself as a decrease in the amount of energy available to perform work. For example, as the energy in coal is transformed into electricity, some of that energy is converted into a less intense state, heat.
The second law also reveals the irreversibility of energy flow and, more abstractly, of time. Unless obstructed by a barrier such as a chemical bond, energy tends to dissipate from high-potential to low-potential states, but the reverse does not occur: the amount of potential work that an energy source can perform does not increase as it is used. As free energy is transformed, the amount of entropy increases. Wood in a stove can be burned to release heat that will warm a home; however, the heat will dissipate from the warmer stove to the (assumed) cooler room. Without additional wood placed in the stove (an increase in the available energy), the temperature of the air in the room will move toward equilibrium with the surrounding environment.
The remains of the burned wood do not have much, if any, energy available for work. On a larger scale, when energy flow stops, evolution ceases. Until that moment, time can be measured by assessing the amount of entropy; the presence of more entropy indicates a progression in time. However, entropy does not increase steadily, as specific events such as a forest fire can speed the pace of entropy’s change and others can slow it down.
Entropy multiplied by temperature measures the amount of energy that has been degraded from a state in which it is able to perform work to one in which it is not. This occurs as heat is exchanged between two systems. This principle was the basis of Sadi Carnot’s study of the physics of steam engines; heat is transferred along the thermal gradient from the hot part of the engine to the cool part, and the energy available for future work decreases while the entropy increases. In order for the engine to continue running, more free energy would need to be added.
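To make the relationship concrete, the following Python sketch (with assumed, illustrative values for the heat transferred and the reservoir temperatures, not figures from this article) computes the total entropy change when heat flows from the hot part of an engine to the cooler part, along with the energy degraded, which is the entropy change multiplied by the temperature at which the heat is discarded.

# Illustrative sketch; Q, T_hot, and T_cold are assumed values.
Q = 1000.0      # joules of heat transferred along the thermal gradient
T_hot = 500.0   # kelvin, temperature of the hot part of the engine
T_cold = 300.0  # kelvin, temperature of the cool part

dS_hot = -Q / T_hot   # entropy lost by the hot side
dS_cold = Q / T_cold  # entropy gained by the cool side (larger in magnitude)
dS_total = dS_hot + dS_cold

degraded = T_cold * dS_total  # energy no longer available for work

print(f"Total entropy change: {dS_total:.2f} J/K")  # positive, as the second law requires
print(f"Energy degraded:      {degraded:.1f} J")    # entropy change multiplied by temperature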
In equation form, entropy is calculated as S = Q/T, where S stands for entropy, Q for the heat exchanged between a system and its surroundings, and T for temperature. As coal is burned, it produces heat; in other words, the energy in coal is transformed from a higher-intensity, lower-entropy level to a lower-intensity, higher-entropy level. For a system, entropy can also be expressed in terms of the total energy, the free energy, and the temperature: E = F + TS, where E is the total energy, F is the free energy, T is the temperature, and S is the entropy. Restated for entropy, the equation becomes S = (E - F)/T.
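As a quick check of the algebra, the short Python sketch below (the numbers are assumptions chosen only for illustration) evaluates entropy both ways: from the heat exchanged and the temperature, and from the total energy and the free energy.

# Illustrative check of the two entropy expressions; all values are assumed.
Q = 600.0   # heat exchanged with the surroundings, in joules
T = 300.0   # temperature, in kelvin

S = Q / T   # entropy from heat exchange: S = Q/T
print(f"S = Q/T       = {S:.1f} J/K")

# Relation among total energy E, free energy F, temperature T, and entropy S:
# E = F + T*S, which rearranges to S = (E - F)/T.
F = 2000.0             # assumed free energy, in joules
E = F + T * S          # total energy consistent with the relation
S_check = (E - F) / T  # recover the entropy from E, F, and T
print(f"S = (E - F)/T = {S_check:.1f} J/K")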
In a closed system, entropy continues to increase as heat disperses until it reaches its maximum at equilibrium, or a steady state. At that point, there is no free energy left, the system is fully disordered, and, as noted above, change and evolution cease.
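The approach to maximum entropy can also be illustrated with a small simulation. In the Python sketch below (the heat capacities and starting temperatures are assumptions chosen for illustration), two bodies in a closed system exchange heat in small parcels; the running entropy total rises at every step and levels off once the temperatures have equalized.

# Minimal simulation of heat dispersal in a closed system; all values are assumed.
C = 10.0               # heat capacity of each body, in J/K
T1, T2 = 400.0, 200.0  # starting temperatures, in kelvin

S_total = 0.0
while T1 - T2 > 1e-6:
    q = 0.01 * (T1 - T2)        # a small parcel of heat flows from hot to cold
    S_total += q / T2 - q / T1  # the cold body gains more entropy than the hot body loses
    T1 -= q / C
    T2 += q / C

print(f"Equilibrium temperature: {T1:.1f} K")         # about 300 K
print(f"Entropy produced:        {S_total:.3f} J/K")  # has risen to its maximum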
Creating order in a system, measured as negative entropy (negentropy), requires the transformation of free energy. For example, building a car from many parts requires human or machine labor, both of which require energy to power the work. The order appears in the form of the assembled car, but during assembly, disorder increases in the form of the heat produced as food is metabolized in the human body or as electricity powers a robotic machine.