Quantum Statistical Mechanics

Type of physical science: Atomic physics

Field of study: Nonrelativistic quantum mechanics

Statistical mechanics relates the macroscopic properties of matter to its microscopic properties. From the quantum-mechanical description of individual atoms and molecules, the behavior of matter in bulk can be deduced.


Overview

Matter in bulk, as it is usually experienced in everyday life or in the scientific laboratory, consists of an immense number of atoms--something of the order of 10^23. Individual atoms and the interactions among them are governed by the laws of quantum mechanics. The goal of quantum statistical mechanics is to relate phenomena in these two diverse domains of scale: the macroscopic and the microscopic. Applied to systems in equilibrium, which is the same type of system covered by thermodynamics, the subject is often known as "statistical thermodynamics." A less developed branch is "nonequilibrium statistical mechanics," which concerns the transport of matter and energy in macroscopic systems removed from equilibrium.

Fluid dynamics (for example, hydrodynamics and aerodynamics) is, in essence, the application of nonequilibrium statistical mechanics.

Thermodynamics is based on four inductive generalizations on the behavior of matter, drawn from extensive observation of physical and chemical phenomena. From these four thermodynamic laws coupled with equations of state, which represent the specific properties of individual substances, an immense body of experimental facts can be rationalized and correlated.

Thermodynamics is concerned principally with the description of physical and chemical phenomena, with the behavior of matter and energy; the underlying mechanisms are beyond its scope. Indeed, one of the strengths of classical thermodynamics is its independence of hard-to-find details of molecular structure. In fact, much of thermodynamics was developed within a conceptual framework in which matter was regarded as a continuous medium and heat as a weightless fluid.

Statistical mechanics probes more deeply than thermodynamics: It seeks to explain as well as describe. Rather than postulating the laws governing macroscopic matter, it attempts to deduce them from the known facts of atomic and molecular behavior. Statistical mechanics is also able to derive approximate equations of state for individual substances, rather than treating these as purely experimental data.

Consider a glass of water, a familiar macroscopic system. From a thermodynamic point of view, this system can be described by only three variables, such as temperature, pressure, and volume. A detailed mechanical description of the same system, based on either classical or quantum mechanics, would involve something like 10^23 variables. There is not enough paper or computer memory to record that much information; moreover, it would become outdated an instant later. Fortunately, however, the laws of large numbers can be used to analyze these data. The behavior of individual molecules may be unpredictable over a large range, but the statistical distribution of the variables describing the molecules will follow a very definite pattern. Analogously, a life insurance company cannot predict which of its individual policyholders will die within a given year, but it does have an accurate statistical profile of what may occur to a sample of several hundred thousand of them. Statistical laws become more accurate the larger the sample. The exceedingly complex microscopic behavior of a large system will, under statistical averaging and the smoothing out of fluctuations, be dramatically simplified.

Each molecule in a macroscopic sample will, according to the concepts of quantum mechanics, exist in one of a discrete set of allowed quantum states labeled by a quantum number n. The corresponding energies are denoted by En. (In practice, n might actually stand for a set of quantum numbers representing different aspects of the molecule's state, such as translation, rotation, vibration, and electronic excitation.) The fundamental question of statistical mechanics is: At a given temperature, how are the molecules distributed among their quantum states? In other words, how many molecules will be in state n = 1, how many in state n = 2, and so forth? (The number of possible values of n is also immensely large.) The answer was first deduced by Ludwig Boltzmann in the nineteenth century (though his derivation was based on classical mechanics). According to the Boltzmann distribution, the number of molecules in a quantum state n is proportional to the exponential quantity

e^(-En/kT). Here, e = 2.71828 . . . , the base of the natural logarithms; En is the energy of the quantum state n; k is the Boltzmann constant, equal to 1.38 x 10^-23 joule/Kelvin; and T is the temperature on the absolute or Kelvin scale. Absolute zero, the lowest conceivable temperature, is defined to be 0 Kelvin, which corresponds to -273.15 degrees Celsius. Room temperature corresponds to about 300 Kelvins.

In conformity with the Boltzmann distribution, the population of a molecular state (that is, how many molecules are in that state) decreases precipitously with increasing energy: the distribution resembles a pyramid, with the energy-rich states sparsely populated near the top. For concreteness, consider a molecule at 300 Kelvins. The energy difference between electronic states is typically of the order of several electronvolts, corresponding to something like 10^-18 joule.

According to the Boltzmann distribution, the population of a state lying this far above the ground state is vanishingly small at room temperature. Thus, the lowest electronic state, known as the ground state, holds essentially the entire population. Vibrational energy levels, separated by a much smaller energy difference, can have a population ratio of twenty or thirty to one. Finally, rotational energy levels, with even smaller energy differences, can have adjacent levels with comparable populations. Only when one goes up five or six rotational levels does the population begin to fall off significantly.
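These population patterns can be checked numerically. The sketch below (Python; the three energy gaps are assumed, order-of-magnitude figures chosen purely for illustration) evaluates the Boltzmann factor e^(-ΔE/kT) at 300 Kelvins:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, joule/kelvin

def boltzmann_ratio(delta_e, temperature):
    """Population of a level, relative to the ground state, for an
    energy gap delta_e (joules) at the given temperature (kelvins)."""
    return math.exp(-delta_e / (K_B * temperature))

T = 300.0
# Representative gaps (assumed, order-of-magnitude only):
gaps = {
    "electronic":  1.0e-18,  # ~6 eV: excited-state population negligible
    "vibrational": 1.3e-20,  # ratio of order 1/25
    "rotational":  4.0e-22,  # ratio close to 1: adjacent levels comparable
}
for name, gap in gaps.items():
    print(name, boltzmann_ratio(gap, T))
```

At 300 Kelvins, kT is about 4.1 x 10^-21 joule, so the three gaps above give ratios of roughly 10^-105, 1/23, and 0.9, matching the qualitative picture of electronic, vibrational, and rotational populations.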

At higher temperatures, the occupation of higher energy levels increases significantly. At 500 Kelvins, for example, the populations of excited vibrational and rotational levels grow to an appreciable fraction of the ground-state population. Such excited molecules are more likely to undergo chemical reactions. It is well known that most chemical reactions run faster at higher temperatures, which is the reason that some foods become edible only after being cooked.

The Boltzmann distribution was derived long before the quantum theory and was thus based on classical mechanics. A classical description remains a valid approximation when applied to entire atoms or molecules, for example in consideration of fluid pressure and transport properties. This description must be replaced by the more correct quantum-mechanical treatment, however, when considering the internal dynamics of atoms and molecules.

As illustrated in the earlier example of a glass of water, a given macrostate, described by only three variables, is consistent with an immense number of different microstates, each described by some 10^23 variables. It was also Boltzmann's idea that the number of microstates consistent with a given macrostate is a measure of the probability of that macrostate, given quantitatively by the famous Boltzmann formula: S = k ln W. Here, S represents the entropy of the macrosystem and W, the number of compatible microstates. In quantum-mechanical terminology, W represents the degeneracy of the macrostate.

Both the second and the third laws of thermodynamics follow from the Boltzmann formula. When a constraint on a system is removed, that system will tend spontaneously to an arrangement having a greater number of available microstates. This result corresponds to a process in which the entropy of the system increases--thus the formulation of the second law in terms of increasing entropy in spontaneous processes. Moreover, one can characterize states of higher entropy as being associated with greater freedom or disorder or randomness.

For a perfectly ordered system at absolute zero, which is merely an idealization in the real world, there should be only one possible microstate. Every molecule (or atom) would be in its ground rotational, vibrational, and electronic state and precisely in place in a perfect crystal.

Setting W equal to 1 in the Boltzmann formula and noting that ln 1 = 0, one finds that the entropy S equals zero. This is one possible statement of the third law of thermodynamics.
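The third-law limit, and the scale of entropy for a realistic sample, can be illustrated directly from the Boltzmann formula. A minimal sketch in Python (the two-configurations-per-molecule example is an assumption chosen for simplicity):

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, joule/kelvin
AVOGADRO = 6.02214076e23  # molecules per mole

def entropy_from_log_microstates(ln_w):
    """S = k ln W, taking ln W directly, since W itself
    can be far too large to represent as a number."""
    return K_B * ln_w

# Third law: a single microstate (W = 1, so ln W = 0) gives zero entropy.
s_ordered = entropy_from_log_microstates(0.0)

# A mole of molecules with two equally likely configurations each:
# W = 2^N, so ln W = N ln 2, giving S = R ln 2, about 5.76 joule/kelvin.
s_mole = entropy_from_log_microstates(AVOGADRO * math.log(2))
```

Working with ln W rather than W sidesteps the astronomical size of the microstate count, which is the usual practice in statistical-mechanical calculations.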

If the quantum energy levels En are known, either as numerical quantities or as an algebraic formula, then it is possible to calculate the partition function or sum-over-states of a system, defined as the sum over all the Boltzmann exponentials:

e^(-E1/kT) + e^(-E2/kT) + e^(-E3/kT) + . . .

Once the partition function is known, it is, in principle, possible to compute all the thermodynamic properties of a substance. In practice, this procedure works best either for gases, in which a high degree of randomness prevails, or for crystals, in which a high degree of order exists. The intermediate state of matter--liquids--lies somewhere in between. Thus, the statistical-mechanical theory of liquids is still incomplete and is an active area of research.
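As a concrete sketch (Python), one can sum the Boltzmann exponentials for a hypothetical, evenly spaced ladder of levels (the spacing below is assumed) and then use the partition function to obtain a thermodynamic property, here the average energy. For an even ladder, the sum is a geometric series with a known closed form, which serves as a check:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, joule/kelvin

def partition_function(levels, temperature):
    """Sum of Boltzmann exponentials over a list of energy levels (joules)."""
    return sum(math.exp(-e / (K_B * temperature)) for e in levels)

def mean_energy(levels, temperature):
    """Boltzmann-weighted average energy, one of the properties
    obtainable once the partition function is known."""
    q = partition_function(levels, temperature)
    return sum(e * math.exp(-e / (K_B * temperature)) for e in levels) / q

# Hypothetical evenly spaced ladder, spacing 2e-21 joule (assumed):
spacing = 2.0e-21
levels = [n * spacing for n in range(200)]  # truncated; terms decay geometrically

q = partition_function(levels, 300.0)
q_exact = 1.0 / (1.0 - math.exp(-spacing / (K_B * 300.0)))  # geometric series
u = mean_energy(levels, 300.0)
u_exact = spacing / (math.exp(spacing / (K_B * 300.0)) - 1.0)
```

Because the exponentials decay rapidly, a modest truncation of the infinite sum already reproduces the closed-form results to high accuracy.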

Especially at temperatures approaching absolute zero, the intrinsic angular momentum, known as the spin, of quantum-mechanical particles becomes significant to their statistical behavior. Particles having half-odd-integer spin (½, 3/2, 5/2, and so on) are known as fermions. Electrons, protons, and neutrons all belong in this category. No two fermions in a system can occupy the same quantum state. Applied to the electrons in an atom, this restriction is known as the Pauli exclusion principle, and its consequence is the periodic structure of the elements. The other class of particles, having integer values of the spin (0, 1, . . .), are known as bosons. There is no limit to how many bosons can occupy a given quantum state. At low temperatures, in fact, the preference is for as many particles as possible to occupy the lowest energy state. This behavior underlies the phenomenon of superconductivity and the remarkable properties of liquid helium. Near room temperature, the number of states available to a quantum system becomes immensely greater than the number of particles. The statistical behavior of both fermions and bosons can then be adequately described by the Boltzmann distribution.
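The three statistics can be compared with a small sketch (Python; here x stands for (E - μ)/kT, the state energy measured from the chemical potential in units of kT):

```python
import math

def fermi_dirac(x):
    """Mean occupation for fermions: never exceeds one per state."""
    return 1.0 / (math.exp(x) + 1.0)

def bose_einstein(x):
    """Mean occupation for bosons (valid for x > 0); grows without
    bound as x -> 0, reflecting the pile-up of bosons in low states."""
    return 1.0 / (math.exp(x) - 1.0)

def boltzmann(x):
    """Classical (Maxwell-Boltzmann) occupation."""
    return math.exp(-x)

# Degenerate regime: at x = 0 a fermion state is exactly half filled.
half_filled = fermi_dirac(0.0)

# Classical regime: for x >> 1 all three expressions nearly coincide,
# which is why the Boltzmann distribution suffices near room temperature.
```

For x = 10, for example, the three occupations agree to better than 0.01 percent, illustrating the convergence to classical behavior described above.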

Applications

Quantum statistical mechanics enables predictions to be made regarding the properties of substances from a knowledge of their constituent molecules. This ability is extremely useful when direct measurement is difficult or impossible, such as in the exhaust of a rocket, in a highly corrosive environment, in the core of a nuclear reactor, or even in the interior of a star. The partition function of a system, when it is possible to calculate or estimate it, is the key quantity.

Most directly, the partition function can be calculated if the energy levels of the molecules in the system are known. Information about molecular energies comes most often from spectroscopic techniques, such as the measurement of the absorption of electromagnetic radiation as its frequency is varied. Microwave spectroscopy determines the rotational energy levels of a molecule; infrared spectroscopy determines vibrational energies; and spectroscopy in the visible and ultraviolet regions involves electronic energy levels. In addition, structural information from techniques such as X-ray and electron diffraction can be used in the computation of partition functions.
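For instance, a rotational constant B (in cm^-1) measured by microwave spectroscopy fixes the rotational energy levels E_J = hcBJ(J+1) of a linear molecule, from which the rotational partition function follows directly. A sketch in Python (the value of B for carbon monoxide is an approximate literature figure):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, joule/kelvin
H = 6.62607015e-34   # Planck constant, joule second
C = 2.99792458e10    # speed of light in cm/second, to match B in cm^-1

def q_rotational(b_cm, temperature, j_max=200):
    """Rotational partition function of a linear molecule from its
    rotational constant b_cm (cm^-1): sum of (2J+1) exp(-E_J/kT)."""
    return sum((2 * j + 1)
               * math.exp(-H * C * b_cm * j * (j + 1) / (K_B * temperature))
               for j in range(j_max))

# Carbon monoxide: B is roughly 1.93 cm^-1 (approximate literature value)
q = q_rotational(1.93, 300.0)
# At room temperature the estimate kT/(hcB) should lie close to the full sum:
q_high_t = K_B * 300.0 / (H * C * 1.93)
```

The agreement between the explicit sum and the high-temperature estimate reflects the closely spaced rotational levels discussed earlier: around a hundred rotational states are thermally populated at room temperature.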

The symmetry of a molecule is described by enumerating the planes of reflection and axes of rotation that transform the molecule into an indistinguishable copy of itself. A symmetry number determined in this way makes an important contribution to the partition function and is often a dominant consideration in the interconversion of isomers (molecules made of the same atoms joined together in different geometric arrangements). Because symmetry is related in a very intuitive way to order and randomness, there is an obvious connection between this molecular attribute and a thermodynamic property: the entropy. This concept has no analog in classical statistical mechanics.

Equilibrium constants of chemical reactions determine to what extent a reaction will proceed from initial reactants to final products. They can be calculated from the molecular parameters of the reactants and products by the methods of statistical thermodynamics. Often, the yield of a desired product can be maximized by changing the temperature or pressure in a direction suggested by the statistical formulas.
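A schematic illustration for the simplest case, an isomerization A ⇌ B, where K = (qB/qA)e^(-ΔE0/kT) with ΔE0 the difference in ground-state energies (all numbers below are hypothetical):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, joule/kelvin

def equilibrium_constant(q_reactant, q_product, delta_e0, temperature):
    """K = (q_B / q_A) * exp(-delta_e0 / kT) for the isomerization A <=> B,
    with delta_e0 the ground-state energy of B minus that of A (joules)."""
    return (q_product / q_reactant) * math.exp(-delta_e0 / (K_B * temperature))

# Hypothetical isomers: B lies 4e-21 joule higher in energy but has a
# richer set of thermally accessible states (a larger partition function).
k_300 = equilibrium_constant(50.0, 400.0, 4.0e-21, 300.0)
k_500 = equilibrium_constant(50.0, 400.0, 4.0e-21, 500.0)
# Raising the temperature shifts the balance toward the higher-energy isomer.
```

The temperature dependence in this toy example is the statistical counterpart of the familiar thermodynamic rule that heating favors the energetically uphill side of an equilibrium.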

There is one interesting case in which statistical mechanics actually uncovered a new aspect of molecular dynamics: hindered internal rotation in certain molecules, for example, the relative motion of the two methyl (CH3) groups in the ethane molecule (CH3-CH3). Its existence was suggested by a discrepancy in the entropy and heat capacity that could not be attributed to any of the known modes of vibration or rotation.

The most difficult area of statistical mechanics concerns the behavior of strongly interacting, many-particle systems. The liquid state is in this category, as are phase transitions among the different states of matter. The most fruitful approach to such systems has been based on computer simulation, in which the detailed behavior of a sample of particles, perhaps several hundred, is calculated explicitly by a powerful computer. In Monte Carlo methods (named for the famous European gambling resort because of their use of probability), an element of randomization is built into the computations. Computer simulations have been successfully applied to phase transitions, such as liquid/solid, paramagnetic/ferromagnetic, and normal/superconducting transitions. In fact, the mutual interplay of the three approaches--theory, experiment, and simulation--has substantially enhanced scientists' understanding of the properties of matter.
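The flavor of such a simulation can be conveyed by a toy example: a Metropolis Monte Carlo run on a one-dimensional ring of Ising spins, a deliberately simplified stand-in for the larger, higher-dimensional models used in phase-transition studies. The sketch accepts or rejects random spin flips with Boltzmann probability and averages the energy over the sampled configurations:

```python
import math
import random

def ring_energy(spins, coupling=1.0):
    """Total energy of a 1D Ising ring: E = -J * sum of s_i * s_(i+1)."""
    n = len(spins)
    return -coupling * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def metropolis_mean_energy(n_spins=50, kt=1.0, sweeps=6000, burn_in=1000, seed=1):
    """Metropolis Monte Carlo estimate of the energy per spin."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n_spins)]
    samples = []
    for sweep in range(sweeps):
        for _ in range(n_spins):
            i = rng.randrange(n_spins)
            # Energy change if spin i is flipped (periodic neighbors)
            delta = 2.0 * spins[i] * (spins[i - 1] + spins[(i + 1) % n_spins])
            # Accept downhill moves always, uphill with probability e^(-delta/kT)
            if delta <= 0.0 or rng.random() < math.exp(-delta / kt):
                spins[i] = -spins[i]
        if sweep >= burn_in:
            samples.append(ring_energy(spins) / n_spins)
    return sum(samples) / len(samples)
```

For this model the exact mean energy per spin is known to be -tanh(J/kT), about -0.76 at kT = J, which the sampled average should reproduce within statistical error; that kind of check against exact results is how simulation methods are validated before being applied to harder problems.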

One of the unsolved problems in the chemistry of life is how a protein or an enzyme, a chain containing several hundred amino acid units, folds itself up into a very definite geometrical arrangement. Were this conformation even slightly off, the molecule would be unable to fulfill its biochemical function. In this application as well, computer simulations of intramolecular motions are being carried out in attempts to uncover the underlying principles of protein folding.

Statistical systems far removed from equilibrium can exhibit highly chaotic behavior. Examples include the weather, heart rhythms, disease epidemics, animal and plant populations, and the stock market. Such systems are studied in a newer outgrowth of statistical mechanics known variously as "nonlinear dynamics" or "complexity theory." Encountered in this field are intriguing objects of fractional dimensionality called fractals, which exhibit self-similarity at different levels of magnification.

Context

The atomic picture of matter developed into a credible scientific theory in the early years of the nineteenth century, with a major contribution by the English chemist John Dalton.

By the middle part of the nineteenth century, attempts were begun to account for the thermal and mechanical properties of matter--the nature of heat and work--in terms of the behavior of its atomic constituents. This stage of development is generally designated as the kinetic theory of matter. As early as 1738, Daniel Bernoulli had explained the pressure of a gas as being attributable to collisions of gas molecules with the walls of their container. Julius Robert von Mayer suggested in 1842 that heat was a manifestation of random molecular motion. These ideas were made quantitative by James P. Joule in 1843 with his derivation of the ideal gas law from kinetic theory. James Clerk Maxwell used probability theory to derive the distribution of molecular velocities in a gas. Boltzmann generalized Maxwell's result to a rigorous and general result on the statistical distribution of energy in a molecular system. Albert Einstein's theory of Brownian motion in 1905 removed the last vestige of doubt that matter was atomic in nature.

The conceptual framework of modern statistical mechanics is largely based on the contribution of the American mathematician and physicist Josiah Willard Gibbs. Gibbs introduced the concept of an ensemble, a hypothetical collection of systems in the same macroscopic state exhibiting a representative sample of possible microscopic states. Ensemble averages of physical properties (such as pressure, energy, and entropy) could be more readily calculated than could time averages that actually correspond to the measured variables.

The quantum theory was born in 1900 with Max Planck's theory of black body radiation. In 1906, Einstein explained the low-temperature heat capacity of crystals by treating the atomic vibrational energies as quantized. Other physicists further applied the quantum theory to derive a statistical mechanical formula for the entropy of an ideal monatomic gas.

After the definitive development of quantum mechanics in 1926, statistical mechanics was completely reformulated in terms of quantum, rather than classical, descriptions of microscopic states. Fortunately, many of the classical results remained valid, including Gibbs's concept of an ensemble. As a result, partition functions could be evaluated using quantum mechanical formulas for the rotational, vibrational, and electronic energies of atoms and molecules. Molecular spectroscopic parameters are now routinely used to calculate the thermodynamic functions of substances.

The emergence of powerful computers in the 1960's stimulated the development of computer simulations as a way to perform statistical mechanics. Considerable progress has been made using computer simulations in studies of the liquid state and phase transitions.

Principal terms

ABSOLUTE ZERO: the lowest conceivable temperature, at which molecular motion is minimized

BOLTZMANN CONSTANT: the fundamental constant of statistical mechanics, designated k, and equal to 1.38 x 10^-23 joule/Kelvin

BOLTZMANN DISTRIBUTION: the way that atoms and molecules are distributed among their allowed energy levels in a macroscopic system at equilibrium

ENTROPY: a measure of the randomness in the microscopic state of a macroscopic system

EQUATION OF STATE: the relationship between thermodynamic variables (usually pressure, volume, and temperature) for a specific substance

MACROSCOPIC: pertaining to matter in bulk

MICROSCOPIC: pertaining to matter on the atomic or molecular level

PARTITION FUNCTION: the sum of the Boltzmann exponentials over all possible quantum states, which determines all the thermodynamic properties of a system

QUANTUM MECHANICS: the theory governing the behavior of atoms and molecules

THERMODYNAMICS: the branch of science dealing with heat, work, and the thermal properties of matter

Bibliography

Blinder, S. M. ADVANCED PHYSICAL CHEMISTRY. New York: Macmillan, 1969. A treatise emphasizing the conceptual foundations of thermodynamics and statistical mechanics.

Chandler, David. INTRODUCTION TO MODERN STATISTICAL MECHANICS. New York: Oxford University Press, 1987. An advanced monograph stressing some of the more modern aspects of statistical mechanics, including Monte Carlo methods.

Gleick, James. CHAOS: THE MAKING OF A NEW SCIENCE. New York: Viking Press, 1987. A highly acclaimed popular account explaining chaos, fractals, strange attractors, the Mandelbrot set, and other aspects of this new branch of mechanics.

McQuarrie, Donald A. STATISTICAL MECHANICS. New York: Harper & Row, 1976. A much-referred-to graduate-level textbook.

Nash, Leonard K. ELEMENTARY STATISTICAL THERMODYNAMICS. Reading, Mass.: Addison-Wesley, 1965. Contains an elementary exposition of statistical mechanics with chemical applications on the level of a general chemistry course.

Reif, Frederick. FUNDAMENTALS OF STATISTICAL AND THERMAL PHYSICS. New York: McGraw-Hill, 1965. A comprehensive undergraduate-level treatment of kinetic theory and statistical mechanics.

Schrödinger, Erwin. STATISTICAL THERMODYNAMICS. Cambridge, England: Cambridge University Press, 1946. A little book containing a lucid account of the principles of statistical mechanics by one of the great scientists of the twentieth century.

Essay by S. M. Blinder