Thermodynamics

Type of physical science: Classical physics

Field of study: Thermodynamics

Heat added to a system can alter its temperature, pressure, entropy, and other properties, and so be stored in the system; it can also cause the system to do work on its surroundings. The laws of thermodynamics relate the heat added to the energy stored and the work done.

Overview

Thermodynamics deals mainly with the energy content and production of systems. The "systems" are most often enclosed quantities of matter (solids, liquids, gases, plasmas, and mixtures of these components). In these cases, the laws of thermodynamics are developed to describe changes in properties of these systems as different forms of energy flow into or out of the system. The two principal kinds of energy of most concern are heat added to or taken from the system, and work done by the system on the confining environment. Theoretical thermodynamics can accommodate more abstract systems, such as confined (massless) radiation or fields or more subtle phases of matter, and even has extraordinary power in nonphysical systems, such as economic models (where economic analogies to thermodynamic terms and laws can be constructed).

The "properties" of a system can be virtually any measurable attribute, such as degree of magnetization or concentration of a dissolved chemical, but are usually illustrated by a basic discussion of a few typical measurables. There are two groups: the "coordinates" (pressure of the system on the environment, volume of a fixed mass of the system, temperature at points in the system, and entropy of the system) and the "potentials" for the system (internal energy, enthalpy, Gibbs function, and Helmholtz function). These potentials represent various forms in which energy is "stored" or "released" by the system as the coordinates change. For example, if a gas is compressed such that no heat is added from the environment, the temperature will rise, and the internal energy is said to rise to represent the energy stored in the gas. This example represents the main application of the "first law of thermodynamics," a thermodynamic version of the law of conservation of energy. Generalizing from mechanics, it includes energy dissipated by friction in considerations of energy balance (a bouncing ball eventually settles down and loses its energy of motion and position to heat energy). One advantage of thermodynamics is that it accounts for energy conversions when more than one coordinate changes at a time (for example, simultaneous change in pressure and temperature). The resulting change in the other properties (density, internal energy, and the like) can be computed from the thermodynamic laws.

The properties of a system are usually described in terms of a few basic, familiar quantities. Temperature is defined as the reading on a standard thermometer, although the second law of thermodynamics allows a more abstract definition. Pressure is defined as the mechanical force exerted by the system on its environment (or container) divided by the area over which this force acts. This definition allows the work done by the system on its environment to be computed simply as the product of the pressure and the volume change. This work includes only the mechanical energy delivered as the system changes, but it is usually of greatest interest, since it represents the useful output of a system, while the heat added represents the expense of energy input. In some arrangements, such as refrigeration, the useful energy is the heat transferred and the expense is the work done (say, by compression by a motor). In either case, the energy balance of heat, work, and (occasionally) other kinds of energy transfer is the focus of concern.
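
As a minimal sketch of this rule (the pressure and volumes below are hypothetical, chosen only to show the arithmetic), the work delivered by a gas expanding against a constant external pressure is simply the pressure multiplied by the change in volume:

```python
# A minimal sketch (values are hypothetical): work done by a gas expanding
# against a constant external pressure, W = P * dV.

pressure = 101_325.0        # pascals (about one standard atmosphere)
volume_initial = 0.0010     # cubic meters (1.0 liter)
volume_final = 0.0015       # cubic meters (1.5 liters)

work_done_by_system = pressure * (volume_final - volume_initial)
print(f"Work done on the surroundings: {work_done_by_system:.1f} J")  # about 50.7 J
```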

Heat, the major activating energy of systems, is described practically in terms of its effects on standard measuring systems. For example, one standard calorie is defined as the amount of heat required to raise the temperature of one gram of a standard substance (water) by one kelvin.

Another substance exposed to this same standard heat will show a temperature rise characteristic of its composition. The heat capacity of any substance is then the heat it absorbs per degree of temperature rise. Experiments (originated by James Prescott Joule) relate the standard calorie to the unit of energy used in mechanics: by experiment, 1 calorie equals about 4.2 joules. The laws of thermodynamics also permit a more abstract definition of heat, and that definition lies at the heart of the theoretical structure of the subject.
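
A brief illustrative calculation, assuming a hypothetical 250-gram sample of water, shows how the heat capacity and Joule's conversion factor work together:

```python
# A minimal sketch (hypothetical sample): heat needed to warm a mass of water,
# Q = m * c * dT, converted between calories and joules (1 cal is about 4.186 J).

CAL_TO_JOULE = 4.186               # Joule's mechanical equivalent of heat
specific_heat_water = 1.0          # calories per gram per kelvin, by definition

mass_grams = 250.0                 # a cup of water
temperature_rise = 30.0            # kelvins

heat_calories = mass_grams * specific_heat_water * temperature_rise
heat_joules = heat_calories * CAL_TO_JOULE
print(f"{heat_calories:.0f} cal = {heat_joules:.0f} J")    # 7500 cal, about 31,400 J
```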

There are three central laws of thermodynamics. The first is the conservation of energy (often used to define the concept of internal energy). It asserts that the heat added to a system must reveal itself in the sum of work being done by that system and a rise in internal energy of the system. The system can respond to an input of heat in many ways under this rule, changing its coordinates to conserve energy. In an isochoric (constant volume) process, for example, no work is done by the system (since work done is defined through a change in volume), and all the heat goes into raising the internal energy. Energy relations in an isobaric (constant pressure) process are often theoretically revealing, since many chemical processes are isobaric, particularly if they occur on a tabletop where the atmospheric pressure is constant. Adiabatic processes, where no heat is added, are the most useful, for they describe how the internal evolution of a system might proceed. This latter process becomes more interesting from the point of view of the second law of thermodynamics, the centerpiece of most discussions of the subject.
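
The bookkeeping of the first law for these three processes can be sketched as follows; the heat and work figures are hypothetical, and work done by the system is counted as positive:

```python
# A minimal sketch of the first law, dU = Q - W, for the processes named above.
# Heat and work figures are hypothetical; work done BY the system counts as positive.

def internal_energy_change(heat_added, work_done_by_system):
    """First law of thermodynamics: change in internal energy = Q - W."""
    return heat_added - work_done_by_system

# Isochoric: no volume change, hence no work; all the heat raises internal energy.
print(internal_energy_change(heat_added=100.0, work_done_by_system=0.0))   # 100.0

# Isobaric: some of the heat leaves again as work of expansion.
print(internal_energy_change(heat_added=100.0, work_done_by_system=40.0))  # 60.0

# Adiabatic: no heat added; internal energy falls by exactly the work done.
print(internal_energy_change(heat_added=0.0, work_done_by_system=25.0))    # -25.0
```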

There are several statements of the second law of thermodynamics, which are all equivalent. The simplest statement asserts that heat cannot be entirely converted to work by a system. For example, a bouncing ball can eventually come to rest, thus converting all of its mechanical energy (or work) into heat; it heats up as it settles. Nevertheless, one does not expect to heat a resting ball and have it start bouncing; the heat cannot be entirely transformed into work. An equivalent statement asserts that heat will not flow spontaneously from a cold object to a hot object; an ice cube on a hot pavement is expected to warm, not cool. Another version asserts that when systems are out of equilibrium—that is, when parts of a system are at different temperatures—the temperature changes to reduce the differences (toward equilibrium). In the case of the ice cube, one expects the pavement to cool a little while the ice cube heats up, not vice versa. The justification for asserting these laws can be either a statement about the observed properties of natural systems or a more abstract imposition of the second law as a condition of intelligibility of a system. In either case, the second law has fascinating and profound consequences.

A particularly convenient view of the second law of thermodynamics is provided by defining a new property of the system, a coordinate called entropy. Returning to the ice cube example, the change in entropy is defined as the amount of heat entering the cube (in a short instant) divided by the temperature (at that instant). If one claims that the second law requires the heat to flow from the pavement to the cube (not in reverse), then one notices that the entropy received by the cube is greater than that given up by the pavement. This follows simply from the definition, for the heat given and taken was the same (by the first law of thermodynamics), but the temperatures were different. The lower-temperature ice cube gained more entropy (heat divided by temperature) than the hotter pavement gave up. In general, the second law of thermodynamics asserts that the entropy of a "universe" (system plus its surroundings) must always increase as the system tends toward equilibrium with its surroundings. This is a much-discussed aspect of the second law, as it provides an "arrow of time" for physical systems—that is, a natural direction of evolution—the reverse of which is prohibited. Quantitatively, the second law allows the prediction of the particular equilibrium state toward which chemical and other interactions will tend. This is immensely practical in studying such diverse systems as chemical reactions, heat flow in low-temperature solids, nuclear reactions, and even information flow in the analogous, nonthermodynamic systems of information theory.
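
The ice-cube argument can be made quantitative in a short sketch, assuming hypothetical temperatures for the pavement and the ice:

```python
# A minimal sketch of the ice-cube example (temperatures are hypothetical):
# the same heat leaves the hot pavement and enters the cold cube, yet the
# entropy of the pair taken together rises, as the second law requires.

heat_transferred = 500.0     # joules flowing from pavement to ice
temp_pavement = 320.0        # kelvins (hot)
temp_ice = 273.0             # kelvins (cold)

entropy_lost_by_pavement = heat_transferred / temp_pavement   # about 1.56 J/K
entropy_gained_by_ice = heat_transferred / temp_ice           # about 1.83 J/K

net_change = entropy_gained_by_ice - entropy_lost_by_pavement
print(f"Net entropy change of the 'universe': {net_change:.3f} J/K")   # positive
```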

Another abstract view of the first two laws of thermodynamics is that they prohibit certain processes in nature. Perpetual motion machines that deliver more work out than they take heat in, or that cycle work and heat without "running down," violate the laws of thermodynamics.

There are extensions of the basic arguments. Some natural systems, particularly biochemical ones, seem to cycle between low- and high-entropy states. This has led to an extension of the laws of thermodynamics to "nonequilibrium" systems, in which self-organizing systems may tend toward ultimate equilibrium in surprising ways, sometimes lowering entropy locally, as in the spontaneous organization ("de-evolution") of a genetic system from a state of molecular disorder into ordered, symmetric genetic material.

Other forms of system energy are considered besides internal energy, such as enthalpy, a combination of the internal energy with the pressure and volume coordinates. Its role in isobaric (constant pressure) processes parallels that of internal energy in isochoric (constant volume) processes: at constant pressure, the heat added equals the change in enthalpy, just as at constant volume it equals the change in internal energy. The related Helmholtz and Gibbs functions behave similarly, the latter being particularly useful in isothermal processes such as phase changes, where the system maintains a fixed temperature while using energy to change its physical structure, as when ice melts.
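
For reference, the standard textbook definitions of these potentials, written in terms of the internal energy U, pressure P, volume V, temperature T, and entropy S, are:

```latex
% Standard definitions of the thermodynamic potentials named above, in terms of
% internal energy U, pressure P, volume V, temperature T, and entropy S.
\begin{align*}
  H &= U + PV               && \text{(enthalpy)} \\
  F &= U - TS               && \text{(Helmholtz function)} \\
  G &= H - TS = U + PV - TS && \text{(Gibbs function)}
\end{align*}
```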

There is also a third law of thermodynamics, asserting that changes in entropy become ever smaller as the temperature of a system diminishes toward absolute zero. This amplifies another consequence of the second law, which states that there must be a minimum (theoretical) temperature, absolute zero (0 kelvin), for all equilibrium systems. The third law of thermodynamics is thus particularly useful in experimental low-temperature work.

Applications

Thermodynamics is very general in structure. It describes systems of a wide range of compositions and a very wide variety of properties. Examples of a system include the classical enclosed vapors of a steam engine, the mixtures of sand, lime, and chemicals that make up cement, and a volume of atmosphere that is the model for a storm at sea. Any knowledge of pairs (or more) of attributes of a system (such as the temperature and density of the elements mixed to make cement) allows the specification of all the other attributes (pressure, entropy, enthalpy, Gibbs function, and the like). Applying the laws of thermodynamics predicts how the variables of the system will evolve (toward equilibrium). Thus, the density (composition) and the "hardness" of the cement are calculable, as are the speed and moisture content of the storm, or the "efficiency" of the steam engine.
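
A minimal illustration of how fixing a few attributes determines the rest is the ideal gas, a far simpler "system" than cement or a storm; assuming hypothetical values, its equation of state yields the pressure once the amount, temperature, and volume are specified:

```python
# A minimal illustration using the ideal gas law, PV = nRT (a far simpler
# "system" than cement or a storm; the numbers are hypothetical): fixing the
# amount, temperature, and volume of the gas fixes its pressure.

R = 8.314            # universal gas constant, J/(mol*K)
moles = 1.0
temperature = 300.0  # kelvins
volume = 0.0248      # cubic meters

pressure = moles * R * temperature / volume
print(f"Pressure: {pressure:.0f} Pa")   # roughly one atmosphere
```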

Historically, the applications of thermodynamics are most visible in the study of engines, particularly in comparisons between the efficiency of steam engines and that of devices using chemical or electrical systems. The competition among steam, diesel, and electric engines for railroads was an example of a thermodynamics-driven technology. Periodically, new kinds of engines are proposed for transportation or energy production, and their claimed merits must be analyzed. New combinations of processes for internal combustion (such as in the Wankel engine of the mid-twentieth century) or new working systems (such as fuels like methane, ethane, or hydrogen) require thermodynamic description.
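
Such an analysis usually begins by comparing an engine's measured efficiency (work out divided by heat in) with the Carnot limit set by the second law; the sketch below uses hypothetical figures:

```python
# A minimal sketch (hypothetical figures): the measured efficiency of an engine,
# W / Q_in, compared with the Carnot limit 1 - T_cold / T_hot imposed by the
# second law on any engine running between two temperatures.

def carnot_limit(t_hot, t_cold):
    """Maximum possible efficiency between reservoirs at t_hot and t_cold (kelvins)."""
    return 1.0 - t_cold / t_hot

heat_in = 1000.0   # joules drawn from the hot source
work_out = 300.0   # joules of useful work delivered

print(f"Measured efficiency: {work_out / heat_in:.0%}")          # 30%
print(f"Carnot limit:        {carnot_limit(800.0, 300.0):.1%}")  # 62.5%
```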

Studies of weather patterns provide excellent opportunities for the thermodynamicist. A mass of wet air moving up a mountainside is an obvious "system," with its pressure, its temperature, and the density of its moisture among its thermodynamic coordinates. The work done against gravity and the heat added by sunlight and the surrounding atmosphere raise its internal energy (and its other "potentials," such as the Gibbs function), and the heat given up as its moisture condenses into rain or snow is another calculable quantity in a model of the air mass. In practice, the atmosphere is so complicated that the most advanced computers are required to keep track of all the details of a weather front, but the principles governing the computer's computations of detail are the simple principles of thermodynamics.
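
One small, greatly simplified piece of such a model can be sketched for a dry parcel of air rising adiabatically (ignoring moisture and sunlight; the two-kilometer climb is hypothetical):

```python
# A greatly simplified sketch (dry air, no sunlight or condensation; the
# 2-kilometer climb is hypothetical): for adiabatic ascent the first law
# reduces to a steady cooling with height, dT/dz = -g / c_p, the dry
# adiabatic lapse rate of roughly 9.8 kelvins per kilometer.

g = 9.81        # gravitational acceleration, m/s^2
c_p = 1005.0    # specific heat of dry air at constant pressure, J/(kg*K)

lapse_rate = g / c_p                 # kelvins of cooling per meter of ascent
ascent = 2000.0                      # meters up the mountainside
temperature_drop = lapse_rate * ascent
print(f"Cooling over a {ascent/1000:.0f} km climb: {temperature_drop:.1f} K")  # about 19.5 K
```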

Low-temperature science and technology, from the early liquefaction of gases to the modern exploitation of the electrical peculiarities of very-low-temperature systems, have required extensive use of the principles of thermodynamics. In particular, the superfluidity of helium at ultralow temperatures (below about 2.2 kelvins) has provided both a theoretical challenge and a technical opportunity to investigate allied properties of matter—superconductivity (resistance-free electrical conduction) and heat conduction at low temperatures.

Hydrodynamics (fluid flow), particularly highly frictional turbulent flow, can be organized by thermal principles, though much is still unknown about the transition from smooth to erratic (even chaotic) patterns of flow. Analogies to eddies and whirlpools in fluid flow can be mathematically constructed in other pictures, such as field-theoretical models of the flow of light or particles in the subatomic regime, and so algebraic models of radiation fields carry the models of thermodynamics into fundamental particle theory.

Practical thermodynamic engineering illustrates the basic principles more directly. The power output of electric or steam-generating plants can be predicted straight from those principles. Nonsteam engines and refrigerators are equally obvious candidates for analysis, as are schemes to generate work and energy in novel ways, such as solar, wind, and tidal power.

The characterization of forms or states of matter itself requires considerable study. The old classifications of matter—solid, liquid, and gas—have been augmented by another major class (plasma) and many subclasses. A plasma is an ionized, electrically charged gas, with considerable utility in applications from lighting to energy generation. The subclasses of matter (phases) arising from different molecular and crystalline patterns keep suggesting new kinds of matter, such as spin glasses and superfluid liquids.

Chemistry and biochemistry are perhaps the main arena of classical thermodynamics, for there the evolution and combination of diverse chemical systems are particularly sensitive to the energy balances of their successive states. Understanding the transfer of chemical energy across membrane boundaries is part of the explanation of muscle action, itself a user of chemical energy derived from nutrition and respiration (two fields of study prominent in the prehistory of thermodynamics). The rhythmic mechanisms controlling the daily and monthly cycles of chemical change in organisms yield their secrets to nonequilibrium thermodynamic models.

Astrophysics and cosmology are perhaps the most dramatic arena of thermodynamic theory, for the evolution of stars and galaxies, the origins of the cosmic expansion, the explosions of stars, and the collapse of systems into black holes are dramatic cosmic examples of systems exchanging energy and matter with their surroundings—the central thematic topic of the theory. Like its companion subjects, mechanics and electrodynamics, thermodynamics is a central pillar of the kind of thinking known as physics.

Context

Modern thermodynamics arose in the middle of the nineteenth century, particularly from the work of Rudolf Clausius and Hermann von Helmholtz. Before that time, there were many scattered investigations into phenomena of heat and temperature, but no clear central theoretical framework. In the early eighteenth century, Sir Isaac Newton focused attention on thermal problems (in addition to his better-known concerns), and experimental developments soon followed in measuring the temperatures and heat involved in chemical and digestive processes.

In the early nineteenth century, Nicolas Léonard Sadi Carnot formulated an early version of the heat-energy relations in his investigations into "the motive power of fire," or the nature of the power of steam engines. Carnot still thought of heat as a material, fluidlike substance (caloric) and did not have the advantage of the later notion of mechanical energy (developed by Helmholtz). Nevertheless, his analysis of the power and action of a steam engine (modeled after his father Lazare Nicolas Marguerite Carnot's ideas for making waterwheels more efficient by minimizing splash) contained the germs of the first and second laws of thermodynamics. His theoretical Carnot cycle would have a lasting influence on the field.

Around the middle of the nineteenth century, Clausius combined the new notion of heat as a kind of motion with Carnot's theories on heat and Helmholtz's on energy, producing the first modern statement of the first and second laws of thermodynamics. Physicists rapidly adapted the newest advances in mathematics to generalize and refine the theory, while engineers and mechanics began to investigate thermal devices and phenomena; Rudolf Diesel, for example, a student of the refrigeration pioneer Carl von Linde, developed the engine that now bears his name. In the second half of the nineteenth century, Ludwig Boltzmann, James Clerk Maxwell, and Clausius developed a related account of heat (known as the kinetic theory), comparing notions of temperature, energy, and heat with the mechanical features of mixtures of atoms and molecules in motion. This kinetic theory eventually grew into the science of statistical mechanics (particularly in the hands of Josiah Willard Gibbs) by the early twentieth century.

Until Albert Einstein and Jean-Baptiste Perrin established the reality of the atomic-molecular motions underlying heat, most early twentieth-century thermal inquiries were conducted with the classical thermodynamics of Clausius and Helmholtz.

These investigations involved such diverse topics as the source of volcanoes, the cooling of the earth (too fast to allow geologists their evolutionary epochs), heating of the upper atmosphere, and chemical reactions requiring or delivering heat. The latter study led to a new field of physical chemistry sponsored by Wilhelm Ostwald, himself a skeptic about the new idea of atoms. Such investigations spawned interest in liquefying gases at low temperatures (Lord Rayleigh discovered argon in this manner) and the achievement of ever lower temperatures. Walther Hermann Nernst was led to formulate the third law of thermodynamics during this quest early in the twentieth century.

In the twentieth century, thermodynamic theory became ever more mathematically sophisticated and general, while its practical applications multiplied still further. Studies of fluid flow at low temperatures and of the behavior of solids under high pressure won Nobel Prizes for their investigators. Numerous chemists and biochemists adapted the laws to their own uses, and by the late twentieth century new areas of nonequilibrium thermodynamics had been pioneered by Ilya Prigogine.

Engineering thermodynamics also kept pace with rapid industrialization and the new age of science. New internal combustion engines, electric motors and generators, and the refrigeration and air-conditioning systems of Linde and Carrier were precursors to the twentieth-century expansion of low- and high-temperature technology, energy production, polymer chemistry and biochemistry, weather studies, and space engineering.

In a sense, thermodynamics is incomplete, for its laws can be applied to ever subtler and more abstract systems. One can extend, for example, the number of independent coordinates (such as temperature and pressure) from two to three, or even to infinity, requiring perhaps new properties and potentials to be defined. It is clear that thermodynamics has even more development to come.

Principal terms

ADIABATIC PROCESS: a process undergone by a system without addition or extraction of heat

COORDINATE: any of the quantities temperature, pressure, volume, or entropy

ENTROPY: a property of a system whose change equals the amount of heat transferred to or from the system (during a small interval of time) divided by the temperature at which it is transferred

HEAT CAPACITY: the amount of heat absorbed by an object divided by the resulting temperature change

INTERNAL ENERGY: energy stored by a system, indicated by a change in temperature and volume

POTENTIAL: any of the measures of stored energy: internal energy, enthalpy, Gibbs function, or Helmholtz function

PRESSURE: the force per unit area exerted by the system on its surroundings

SYSTEM: an amount of matter or radiation contained by a boundary

WORK: the pressure exerted on the surroundings (container) of the system multiplied by the change in volume of the system (measured in joules)

Bibliography

Angrist, Stanley, and Loren Hepler. Order and Chaos: Laws of Energy and Entropy. New York: Basic, 1967. Print.

Atkins, P. W. The Second Law. 2nd ed. New York: Freeman, 1994. Print.

Fermi, Enrico. Thermodynamics. 1936. New York: Dover, 2012. Print.

Harman, P. M. Energy, Force, and Matter: The Conceptual Problems of Nineteenth Century Physics. 1982. Cambridge: Cambridge UP, 1999. Print.

Mendelssohn, Kurt. The Quest for Absolute Zero. London: Taylor, 1979. Print.

Mendelssohn, Kurt. The World of Walther Nernst. Pittsburgh: Pittsburgh UP, 1973. Print.

"Thermodynamics." Khan Academy. Khan Academy, 2015. Web. 3 Apr. 2015.

Van Ness, H. C. Understanding Thermodynamics. New York: Dover, 1985. Print.

Zemansky, Mark Waldo, and Richard Dittman. Heat and Thermodynamics. 8th ed. New York: McGraw-Hill, 2012. Print.

Essay by Peter D. Skiff