Laws Of Thermodynamics

  • Type of physical science: Classical physics
  • Field of study: Thermodynamics

Thermodynamics deals with the macroscopic properties of bulk matter, which involve thermal effects in an essential way. The results of an immense body of experimental knowledge can be formulated in terms of a small number of generalized abstract principles known as the laws of thermodynamics.


Overview

The science of thermodynamics is concerned with transformations of matter and energy in physical and chemical processes. In particular, it seeks to describe states and processes in which thermal effects—those involving the transfer of heat and changes in temperature—play a significant role. In this regard, mechanics and electromagnetic theory in their purest forms can be considered to treat only the behavior of matter and energy at the absolute zero of temperature.

The name thermodynamics originates from the Greek words therme, meaning heat, and dynamis, meaning power.

The branch of the subject that concerns the equilibrium macroscopic properties of matter is called "classical thermodynamics," to distinguish it from "irreversible thermodynamics"—whose subject matter includes nonequilibrium properties and rate processes—and "statistical thermodynamics"—which treats the subject from a microscopic or molecular point of view. This article will be principally concerned with classical thermodynamics, which has long been a central topic of physical chemistry and also plays a significant role in various branches of physics, engineering, biology, and geology.

Thermodynamics rests on an experimental foundation of immense breadth and depth, encompassing a vast and diverse array of accumulated experience in chemistry, physics, biology, and engineering. In common with other branches of science, the specific details and individual peculiarities of a large number of experimental facts are abstracted and, through inductive reasoning, expressed as a compact set of fundamental laws. The abstract terminology of such inductive principles tends to obscure their origin in specific phenomena, but in consequence their universality and range of application becomes significantly enhanced. A set of inductively derived fundamental laws can subsequently be regarded as a system of postulates. From these there should follow, as logical consequences, all the experimental results that originally led to their formulation, as well as others not previously known.

The modern formulation of classical thermodynamics is based on four fundamental principles, designated as the zeroth, first, second, and third laws. The zeroth law of thermodynamics was enunciated by Sir Ralph Fowler in 1931. It stands most logically before the first and second laws in the theoretical framework of thermodynamics. Since the designations "first" and "second" laws were by then too well established to change, however, the new principle was dubbed the "zeroth law." The zeroth law, also called the law of thermal equilibrium, can be stated: Two systems in thermal equilibrium with a third system are in thermal equilibrium with each other. Two systems are said to be in thermal equilibrium when no net heat is exchanged when they are brought into thermal contact. Although this principle may appear at first sight to be rather obvious and trivial, it is well to point out that the equivalent statement about "chemical" equilibrium between substances is not true. Thus, two systems unreactive with a third system are not necessarily unreactive with each other. (Consider, for example, gaseous ammonia, hydrogen chloride, and helium; neither of the first two reacts with helium, but they do react with each other to produce a white mist of ammonium chloride.)

The zeroth law leads to an operational definition of temperature: If two systems coexist in thermal equilibrium, they have the same temperature.

The temperature scale used in most scientific work is the Celsius scale, defined such that 0 degrees Celsius is the freezing point of water and 100 degrees Celsius is the boiling point (both under a pressure of 1 atmosphere). Of more fundamental significance is the absolute, or Kelvin, temperature scale. Zero on the Kelvin scale represents the lowest conceivable temperature, known as absolute zero. This corresponds to -273.15 degrees Celsius. The degree has the same size as on the Celsius scale but is now designated as 1 Kelvin. Thus, the freezing and boiling points of water are approximately 273 Kelvins and 373 Kelvins, respectively. Room temperature is about 300 Kelvins.
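
The relation between the two scales amounts to a simple shift by 273.15; a minimal sketch in Python (the function names are illustrative, not standard):

    # Conversion between the Celsius and Kelvin temperature scales.
    # Absolute zero, 0 K, corresponds to -273.15 degrees Celsius.

    ABSOLUTE_ZERO_C = -273.15

    def celsius_to_kelvin(t_celsius):
        """Convert a Celsius temperature to kelvins."""
        return t_celsius - ABSOLUTE_ZERO_C

    def kelvin_to_celsius(t_kelvin):
        """Convert a Kelvin temperature to degrees Celsius."""
        return t_kelvin + ABSOLUTE_ZERO_C

    print(celsius_to_kelvin(0))    # freezing point of water: 273.15 K
    print(celsius_to_kelvin(100))  # boiling point of water: 373.15 K
    print(kelvin_to_celsius(300))  # "room temperature": about 27 degrees Celsius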

Friction in mechanical systems results in the evolution of heat, with an associated rise in temperature. This was especially evident in the operation of a grindstone or in the boring of cannon barrels. In the same category is the heating effect of an electric current, in which electrical work is transformed into heat. The interconversion of heat and work was studied in the first half of the nineteenth century, with major contributions by Benjamin Thompson (later to become Count Rumford), Sir Humphry Davy, and Julius Robert von Mayer, culminating in the definitive experiments of James Prescott Joule in the 1840s. These measurements determined the mechanical equivalent of heat, the modern value being 4.184 joules of work per calorie of heat.
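
A minimal numerical sketch of this conversion factor (the amount of frictional work chosen here is purely illustrative):

    # Mechanical equivalent of heat: 4.184 joules of work yield 1 calorie of heat.
    JOULES_PER_CALORIE = 4.184

    def heat_from_work(work_joules):
        """Calories of heat evolved when the given work is fully degraded by friction."""
        return work_joules / JOULES_PER_CALORIE

    # Example: 10,000 J of frictional work, as in boring a cannon barrel
    print(heat_from_work(10_000))  # about 2390 calories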

This led to the understanding of heat as a form of energy transfer. In earlier times, heat had been thought to be a material substance, variously called "phlogiston" or "caloric." The first law of thermodynamics can now be stated: The energy of the universe is constant. A thermodynamic system can indeed either gain or lose energy by heat transfer or the performance of work, but this energy change must be exactly compensated by that of the surroundings.

The law of conservation of energy plays a central role in classical mechanics, electromagnetism, and quantum theory. According to this fundamental principle of science, energy can neither be created nor destroyed, but only converted from one form to another. (The theory of relativity encompasses, in addition, the interconvertibility of matter and energy, but this need not be considered in ordinary physical and chemical phenomena.) The first law of thermodynamics augments the concept of energy by taking into account "thermal energy," the energy associated with the randomized motions of individual molecules, in addition to mechanical and electrical energy. More precisely, heat represents a flow of thermal energy, analogous to work, which is a transfer of mechanical or electrical energy. Whereas work usually involves perceptible motion of part of a system, heat does not, occurring on too small a scale to be macroscopically detectable.

The first law for a thermodynamic process can be expressed in the following mathematical form: ΔU = q + w. Here, q represents the heat transferred into the system and w the work done on the system; ΔU is the resulting change in the energy U of the system. As defined, a negative value of q would mean that the system loses heat, and a negative w, that the system does work on its surroundings. The quantity U introduced by the first law of thermodynamics is a new function of state; that is, it depends only on the variables defining the state of the thermodynamic system, independent of how the system got there. By contrast, heat and work depend critically on the path taken by the system.
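
The bookkeeping implied by the first law can be made explicit in a few lines (a sketch only; the sign convention follows the statement above, with q the heat into the system and w the work done on it):

    # First law of thermodynamics: delta_U = q + w.
    # q > 0: heat flows into the system; w > 0: work is done on the system.

    def energy_change(q, w):
        """Change in the internal energy U of the system."""
        return q + w

    # Example: the system absorbs 500 J of heat and does 200 J of work on its
    # surroundings, so w = -200 J under this convention.
    print(energy_change(q=500, w=-200))  # delta_U = +300 J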

The second law of thermodynamics is one of the most profound and intriguing principles in all of science, with implications going far beyond its original subject. It deals with the direction of spontaneous change. It is a well-documented observation that a drop of ink will diffuse uniformly throughout a beaker of water. The reverse will never occur: the ink will not reassemble into a drop out of the tinted water. More generally, all changes in the universe appear to have a natural direction, with the reverse processes usually being regarded as impossible.

The second law of thermodynamics can be understood most readily from a molecular point of view. To every thermodynamic state of a macroscopic system, as described by a small number of variables, there must correspond a huge number of microscopic or molecular arrangements consistent with that very same state. Thus, exchanging two molecules in a mole of a substance (approximately 6 x 10^23 molecules) will not alter any of the observable properties of the system. The essence of spontaneous change is that a transition is made to a final state that has many more possible molecular arrangements than the initial state. The result of a spontaneous change can therefore be regarded as an increase in the "disorder" or "randomness" of the system.

In 1865, Rudolf Clausius, building on the work of William Thomson (later to become Lord Kelvin), defined the thermodynamic function that precisely describes this randomness: the "entropy," symbolized by S. The original definition was based on a formula involving heat and absolute temperature (dS = dq/T). Later, in 1896, Ludwig Boltzmann postulated the relation S = k ln W, in which W represents the number of different molecular arrangements that correspond to a given thermodynamic state. For a macroscopic sample, W is an almost inconceivably large number, on the order of 10 raised to the power 10^23; "ln" stands for the natural logarithm, which reduces ln W to a number of the order of 10^23; finally, k stands for Boltzmann's constant, equal to 1.38 x 10^-23 joules per Kelvin. To illustrate Boltzmann's idea, consider the possible arrangements of a deck of 52 playing cards. (There is a total of about 8 x 10^67 possible ways to order the cards.)

Obviously, the number of well-shuffled arrangements of the deck will far exceed those that have a highly ordered pattern, for example, the sequence in a brand-new deck: ace of spades, king of spades, deuce of clubs, and so on. Thus, a shuffled deck exhibits a higher value of entropy than an ordered one. The numbers involved in a macroscopic thermodynamic system are immensely larger, with 10^23 molecules rather than 52 cards.
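
The card-deck analogy can be made quantitative with a short calculation (a sketch; the printed comparisons are illustrative, using the Boltzmann formula and constant quoted above):

    import math

    K_BOLTZMANN = 1.38e-23  # Boltzmann's constant, joules per Kelvin

    def boltzmann_entropy(num_arrangements):
        """Entropy S = k ln W for W equally probable molecular arrangements."""
        return K_BOLTZMANN * math.log(num_arrangements)

    # Number of distinct orderings of a 52-card deck: 52!, about 8 x 10^67.
    orderings = math.factorial(52)
    print(f"{orderings:.1e}")            # ~8.1e+67
    print(boltzmann_entropy(orderings))  # ~2e-21 J/K: negligible on a macroscopic scale

    # For a mole of molecules, ln W is itself of order 10^23, so S = k ln W
    # comes out to a few joules per Kelvin, a macroscopically significant value.
    print(K_BOLTZMANN * 1e23)            # ~1.4 J/K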

The second law of thermodynamics can be stated: Every spontaneous process increases the entropy of the universe. Since the "universe" includes both a system and its surroundings, there is no requirement that the entropy of the system alone has to increase. For example, in the freezing of water, the resulting ice has a decreased entropy. This is more than compensated by an increase in the entropy of the surroundings.

Some classic examples of natural processes, all associated with increasing entropy, include the flow of heat from a warmer to a cooler body; the conversion of work into heat by friction; the expansion of a gas to fill its container; the freezing of supercooled liquid water (meaning water at a temperature below 0 degrees Celsius), and the reaction of gaseous ammonia and hydrogen chloride to form ammonium chloride. Any of the preceding processes can, in fact, be reversed, but to do so would require some external intervention. For example, heat can be made to flow from a cooler to a warmer body (this is precisely what a refrigerator does) by means of compressors, and the like, which are part of a larger system. The second law of thermodynamics still holds for the composite system, even though there is an entropy decrease within one of its component parts.
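
For the first of these examples, the flow of heat from a warmer to a cooler body, the entropy bookkeeping can be sketched directly from the Clausius formula dS = dq/T quoted earlier (the temperatures and amount of heat below are illustrative):

    def entropy_of_heat_flow(q, t_hot, t_cold):
        """Total entropy change when heat q (in joules) flows from a reservoir
        at t_hot to a reservoir at t_cold (temperatures in kelvins)."""
        ds_hot = -q / t_hot    # the warmer body loses heat
        ds_cold = q / t_cold   # the cooler body gains the same heat
        return ds_hot + ds_cold

    # Example: 1000 J flowing from a body at 400 K to one at 300 K
    print(entropy_of_heat_flow(1000, t_hot=400, t_cold=300))  # +0.83 J/K: positive, as the second law requires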

The second law of thermodynamics is sometimes stated as the "law of degradation of energy," meaning that energy in some low-entropy form is spontaneously converted to energy in a higher-entropy form. The classic example is the degradation of work into heat by friction.

Modern civilization is crucially dependent on the conversion of heat into work by means of heat engines. This partial reversal of energy degradation entails the rejection of heat to a cold reservoir to compensate for the heat drawn from a hot reservoir and converted into work. The theory of the heat engine, based on the second law of thermodynamics, was first developed by the French engineer Nicolas Léonard Sadi Carnot in 1824.
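
Carnot's analysis leads to the celebrated result that no engine operating between two reservoirs can convert more than the fraction 1 - T_cold/T_hot of the extracted heat into work. That formula is not derived in this article, but it is standard and can be evaluated in a line or two (the reservoir temperatures below are illustrative):

    def carnot_efficiency(t_hot, t_cold):
        """Maximum fraction of heat drawn from the hot reservoir that any heat
        engine can convert into work; temperatures in kelvins."""
        return 1.0 - t_cold / t_hot

    # Example: steam at 500 K rejecting heat to surroundings at 300 K
    print(carnot_efficiency(500, 300))  # 0.4: at most 40 percent of the heat becomes work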

A useful corollary to the second law of thermodynamics focuses entirely on the behavior of the system, independent of its surroundings. This entails definition of a function known as the "free energy," in terms of which the second law can be restated: Every spontaneous process decreases the free energy of a system. Free energy was first introduced by the American physicist and mathematician Josiah Willard Gibbs. Actually, there are two variants of free energy: for systems in which the pressure and temperature are controlled, the Gibbs energy function G is used; for systems for which the natural variables are volume and temperature, the Helmholtz energy function A is used.
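
The Gibbs energy is defined by the standard relation G = H - TS (not written out above), so that at constant temperature and pressure a process is spontaneous when ΔG = ΔH - TΔS is negative. A minimal sketch, using approximate literature values for the freezing of one mole of water:

    def gibbs_energy_change(delta_h, delta_s, temperature):
        """delta_G = delta_H - T * delta_S at constant temperature and pressure.
        A negative result indicates a spontaneous process."""
        return delta_h - temperature * delta_s

    # Freezing of one mole of liquid water (approximate textbook values):
    DELTA_H_FREEZE = -6010.0  # J/mol: heat released on freezing
    DELTA_S_FREEZE = -22.0    # J/(mol K): the entropy of the system decreases

    print(gibbs_energy_change(DELTA_H_FREEZE, DELTA_S_FREEZE, 263.15))  # ~ -220 J/mol: spontaneous at -10 C
    print(gibbs_energy_change(DELTA_H_FREEZE, DELTA_S_FREEZE, 283.15))  # ~ +220 J/mol: not spontaneous at +10 C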

The third law of thermodynamics was discovered by Walther Hermann Nernst in 1912.

Later reformulations allow the statement of the third law in either of the equivalent forms: The absolute zero of temperature is unattainable in a finite number of operations; or, as the absolute temperature is reduced toward zero, the entropy of a perfect crystalline solid approaches zero.

The state of a thermodynamic system at absolute zero can be characterized as being perfectly ordered on a molecular level. Thus, for this state alone, the microscopic arrangement is unique, corresponding to W = 1 in Boltzmann's formula. Also, since ln 1 = 0, the entropy S equals zero. According to classical mechanics, molecular motion would cease entirely at absolute zero. Quantum mechanics, the more correct theory for molecules, predicts that at absolute zero, the amplitudes of molecular motion are reduced to their lowest possible levels, but not zero.

In discussing the laws of thermodynamics, extensive use has been made of molecular concepts. It should be noted that, in its purest form, classical thermodynamics is actually independent of molecular details, or indeed of the very existence of molecules. Historically, this has been an advantage, allowing the science of thermodynamics to develop in advance of the modern theories of molecular structure.

Applications

The laws of thermodynamics find application in many branches of science and technology. Applications to chemistry constitute the subfield called chemical thermodynamics.

From extensive tabulations of free energies of the elements and their compounds, it is possible to make predictions about the course of chemical reactions involving these substances. A pioneer in this field was the American physical chemist Gilbert Newton Lewis. In the commercially vital synthesis of ammonia by the Haber process, named for Fritz Haber, the temperature and pressure can be optimized in accordance with thermodynamic considerations so as to achieve maximum product yield.

Ammonia is used as a fertilizer and as a precursor for other essential chemicals.
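
The link between tabulated free energies and reaction yields runs through the standard relation ΔG° = -RT ln K, which connects the standard free energy change of a reaction to its equilibrium constant K. A minimal sketch (the numerical free energy value is an approximate tabulated figure, used here only for illustration):

    import math

    R = 8.314  # gas constant, J/(mol K)

    def equilibrium_constant(delta_g_standard, temperature):
        """Equilibrium constant from delta_G_standard = -R T ln K."""
        return math.exp(-delta_g_standard / (R * temperature))

    # Ammonia synthesis, N2 + 3 H2 -> 2 NH3, at 298 K;
    # approximate tabulated value: delta_G_standard ~ -33,000 J per mole of reaction
    print(f"{equilibrium_constant(-33_000, 298.15):.1e}")  # roughly 6e+05: strongly product-favored at room temperature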

It was realized in the latter part of the nineteenth century that the free energy change in a chemical reaction could be directly harnessed in an electrochemical cell, now commonly called a battery. Many of the advances in thermodynamics during this period were, in fact, developments in electrochemistry. Electrochemical technology has given scientists efficient and portable sources of energy. The continuing development of fuel cells promises a clean, efficient method of converting fuels directly into electricity, circumventing the thermodynamic restrictions on heat engines. Electrolysis is the reverse of the above process, whereby electrical energy is used to carry out chemical change. Electrolytic methods are extensively used in electroplating, material purification, and chemical synthesis. Electrochemical phenomena are known to play an important role in a number of essential biological processes, including conduction of nerve impulses and transport across cell membranes.
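
The connection between the free energy change of a cell reaction and the measurable cell voltage is the standard relation ΔG = -nFE, with F the Faraday constant; a minimal sketch (the two-electron, 1.10-volt example is illustrative, roughly the classic Daniell cell):

    FARADAY = 96485.0  # coulombs per mole of electrons

    def cell_free_energy(n_electrons, cell_emf_volts):
        """Free energy change (J per mole of reaction) harnessed by a reversible
        electrochemical cell: delta_G = -n F E."""
        return -n_electrons * FARADAY * cell_emf_volts

    print(cell_free_energy(2, 1.10))  # about -212,000 J/mol: a strongly spontaneous cell reaction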

By the methods of statistical mechanics, it has become possible to make theoretical predictions of thermodynamic properties of substances, given the appropriate molecular parameters, as obtained usually from spectroscopic measurements. This has enabled the extension of thermodynamic methods to reactions involving highly unstable or exotic molecules that cannot be measured directly. Such applications have been made in atmospheric chemistry, astrophysics, and in the study of reactions involving short-lived intermediates.
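
A minimal sketch of how a spectroscopic parameter feeds into a thermodynamic prediction, using the textbook harmonic-oscillator formulas of statistical mechanics (these formulas, and the carbon monoxide vibrational wavenumber, are not drawn from the article itself):

    import math

    H_PLANCK = 6.626e-34     # Planck's constant, J s
    C_LIGHT_CM = 2.998e10    # speed of light in cm/s (vibrational wavenumbers are in cm^-1)
    K_BOLTZMANN = 1.381e-23  # Boltzmann's constant, J/K
    N_AVOGADRO = 6.022e23    # molecules per mole

    def vibrational_partition_function(wavenumber_cm, temperature):
        """Harmonic-oscillator vibrational partition function, measured from the
        zero-point level, for a vibration of the given wavenumber (cm^-1)."""
        quantum = H_PLANCK * C_LIGHT_CM * wavenumber_cm  # energy of one vibrational quantum, J
        return 1.0 / (1.0 - math.exp(-quantum / (K_BOLTZMANN * temperature)))

    def molar_vibrational_energy(wavenumber_cm, temperature):
        """Molar vibrational contribution to the internal energy (J/mol), above the zero point."""
        quantum = H_PLANCK * C_LIGHT_CM * wavenumber_cm
        x = quantum / (K_BOLTZMANN * temperature)
        return N_AVOGADRO * quantum / (math.exp(x) - 1.0)

    # Carbon monoxide has its fundamental vibration near 2143 cm^-1.
    print(vibrational_partition_function(2143, 298.15))  # close to 1: the vibration is barely excited at room temperature
    print(molar_vibrational_energy(2143, 1000.0))        # about 1200 J/mol at 1000 K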

In 1875, Gibbs published his definitive work on the equilibrium of heterogeneous substances. From this work comes one of the simplest, yet most powerful, results in all of science: the "phase rule," which states that F = C - P + 2, in which P represents the number of phases in a system (that is, solids, liquids, gases), C the number of components (distinct chemical species), and F the number of degrees of freedom (the number of variables, such as pressure, temperature, and composition, needed to specify the state of the system). The phase rule has played a key role in the development of materials science, leading to the design of new materials with desired physical and chemical properties.
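
Because the phase rule is pure counting, it translates directly into code; a minimal sketch applying it to pure water:

    def degrees_of_freedom(components, phases):
        """Gibbs phase rule: F = C - P + 2."""
        return components - phases + 2

    # Pure water (one component):
    print(degrees_of_freedom(components=1, phases=1))  # 2: temperature and pressure can both vary in a single phase
    print(degrees_of_freedom(components=1, phases=2))  # 1: along the liquid-vapor coexistence curve
    print(degrees_of_freedom(components=1, phases=3))  # 0: the triple point is a unique temperature and pressure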

Context

In the overall scheme of modern science, thermodynamics has played something of the role of a censor. Thermodynamics has seen to it that energy is strictly conserved in all physical and chemical processes, that entropy always increases in the direction of spontaneous change, and that absolute zero is never attained. These rules are to be obeyed unconditionally, whatever the detailed forms of the underlying atomic and molecular mechanics. In a sense, classical thermodynamics has now been superseded by statistical mechanics. With the development of the quantum theory of matter, augmented by statistical laws governing the immense numbers of molecules in a macroscopic sample of matter, it has become possible to "derive" the laws of thermodynamics. Still, the elegant simplicity of thermodynamic formulas, coupled with their independence of the complex details of molecular mechanics, guarantees that this subject will remain useful.

Historically, thermodynamics arose from very practical concerns about how to optimize the performance of steam engines. This was the focus of Carnot's research. The law of conservation of energy is usually credited to Hermann Ludwig Ferdinand von Helmholtz in 1847. In 1850, Clausius formulated the first and second laws of thermodynamics. Gibbs was responsible for much of the subsequent mathematical development of thermodynamics, featuring the free energy, the chemical potential, and the phase rule. Boltzmann and James Clerk Maxwell, in the latter part of the nineteenth century, developed the kinetic molecular theory of heat, which evolved into statistical mechanics. The twentieth century brought the flowering of chemical thermodynamics.

Irreversible, or nonequilibrium, thermodynamics remains a developing area of research.

This field considers nonequilibrium states of systems, which are beyond the scope of classical thermodynamics. Much of this subject was pioneered by the Belgian physical chemist Ilya Prigogine. Irreversible thermodynamics has important implications for the behavior of biological systems and might contain the secret of life itself.

Principal terms

ENERGY: a fundamental conserved physical attribute that a system possesses by virtue of its state of motion or interaction with other systems; operationally, the capacity to perform work or generate heat

ENTROPY: a measure of molecular disorder in a system; entropy increases in spontaneous processes

EQUILIBRIUM: a condition of stability whereby the state of a system does not perceptibly change over some period of time

HEAT: thermal energy transferred into or out of a system, commonly as a flow from a warmer to a cooler body

STATE: the condition of a thermodynamic system, as described by the appropriate physical variables, such as mass, volume, pressure, and temperature

SYSTEM: an explicitly defined part of the world being observed; the system may be isolated from its surroundings or it may be interacting in some specified way

TEMPERATURE: a measure of the thermal energy of a system

THERMAL ENERGY: energy possessed by a system by virtue of the random motions of its constituent molecules

WORK: energy transferred into or out of a system by means of some mechanical or electrical interaction with the surroundings

Essay by S. M. Blinder

Bibliography

Atkins, Peter W. The Second Law. Scientific American Library, 1984.

Bent, Henry A. The Second Law. Oxford UP, 1965.

Blinder, S. M. Advanced Physical Chemistry. Macmillan, 1969.

Hall, Nancy. "Thermodynamics." NASA, 13 May 2021, www.grc.nasa.gov/www/k-12/airplane/thermo.html. Accessed 14 Feb. 2025.

"Laws of Thermodynamics." Isaac Physics, University of Cambridge, isaacphysics.org/concepts/cc‗laws‗of‗thermodynamics. Accessed 14 Feb. 2025.

Moore, Walter J. Physical Chemistry. 4th ed., Prentice-Hall, 1972.

Nash, Leonard K. Elementary Chemical Thermodynamics and Elementary Statistical Thermodynamics. Addison-Wesley, 1962.