Statistical Mechanics

Type of physical science: Classical physics

Field of study: Statistical mechanics

Statistical mechanics is the study of the physical properties of systems containing very large numbers of atoms or molecules. The macroscopic properties of these systems are regarded as averages over the microscopic properties of their constituents, and their behavior is therefore explained in statistical terms.

Overview

A macroscopic system, such as a gas, possesses certain quantitative properties, such as pressure, temperature, and volume. Since these describe the bulk properties of the system, they are known as macroscopic variables. The relationships between these variables are then given by a number of definite laws. The perfect gas law, for example, states that the pressure of a dilute gas multiplied by its volume is proportional to its temperature; the first law of thermodynamics states that the increase in energy of a system is equal to the heat supplied to the system, plus the work done on the system; the second law of thermodynamics states that the entropy of an isolated system always increases and reaches its maximum value in the state of equilibrium; and so on.

According to the kinetic theory of matter, such macroscopic systems are composed of extremely large numbers of atoms and molecules in rapid motion that undergo repeated collisions. The behavior of these constituents of macroscopic bodies is also determined by certain laws: Newton's laws in classical physics and the laws of quantum mechanics in quantum physics. These laws express the relationships between certain microscopic variables, such as position and velocity.

The question then arises: How are the macroscopic variables of bulk matter related to the microscopic variables of its constituents? To put it another way, how are the perfect gas laws and the laws of thermodynamics to be explained in terms of Newton's laws or the laws of quantum theory? Given the huge number of atoms or molecules in a macroscopic system, such as a gas (on the order of 10^23), it would clearly be impossible to track every single particle, solve the relevant equations for the motions of all the particles, and hence determine all the properties of the system. Statistical mechanics solves this problem by considering the macroscopic properties of the system to be averages over its microscopic ones. Thus, the laws obeyed by the macroscopic variables are regarded as being probabilistic, or statistical, in character.

Consider the pressure exerted by the air in a balloon, for example. This is a macroscopic property, which one can measure directly and easily. Yet, the pressure is caused by the air molecules hitting the inside of the elastic skin of the balloon and transferring momentum to it. The pressure one feels is actually the average over these momentum transfers, per unit area.

One does not feel the individual collisions because the number of molecules hitting the skin per second is so large. Likewise, the temperature of a macroscopic body is related to the average kinetic energy of its constituent particles. It makes no sense to talk of the temperature of an individual atom or molecule; temperature is a concept that emerges as a collective property of many atoms or molecules. Thus, pressure, temperature, and the other macroscopic variables are average properties of the distribution of motions of collections of many particles.
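
The momentum-transfer picture of pressure can be made concrete with a toy calculation (all numbers below are illustrative, not taken from the text): a single particle bouncing elastically between the walls of a one-dimensional box.

```python
# Toy model: one particle of mass m bouncing elastically in a 1-D box of
# length L strikes a given wall once per round trip, every 2*L/v seconds,
# reversing its momentum from +m*v to -m*v and so delivering 2*m*v per strike.

m = 4.65e-26   # kg, roughly the mass of one nitrogen molecule
v = 500.0      # m/s, a typical thermal speed at room temperature
L = 0.1        # m, length of the box

strikes_per_second = v / (2 * L)                 # one strike per round trip
momentum_per_strike = 2 * m * v                  # elastic momentum reversal
average_force = strikes_per_second * momentum_per_strike   # equals m*v**2/L

print(average_force)
# One particle contributes only a tiny average force; summing over
# N ~ 10**23 particles moving along three independent axes recovers the
# kinetic-theory result P*V = N*m*<v**2>/3.
```

The point of the sketch is that pressure emerges as a time average over many discrete momentum transfers, exactly as the paragraph above describes.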

Thus, rather than try to determine the individual motions of the particles comprising a gas or other substance, statistical mechanics invokes mathematical statistics in order to obtain the "probability distribution" of particle motions. This is represented by a function known as the "distribution function," which gives the probable number of particles that have positions and velocities within certain arbitrary, extremely small ranges. The actual number of particles in these ranges will fluctuate about the most probable value, as the positions and velocities of individual particles change because of collisions. These random fluctuations can be observed in the phenomenon known as "Brownian motion," where very small macroscopic bodies, such as grains of pollen suspended in a liquid, can be seen to undergo small irregular movements resulting from the impact of individual molecules of the liquid.
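
The fluctuations behind Brownian motion can be illustrated with a one-dimensional random walk, treating each step as the net effect of one molecular kick (a deliberately crude assumption):

```python
import random

random.seed(0)   # fixed seed so the run is reproducible

def random_walk(steps):
    """Position after `steps` unit kicks of random sign."""
    x = 0
    for _ in range(steps):
        x += random.choice((-1, 1))
    return x

# The mean displacement of many such walkers stays near zero, but the
# mean squared displacement grows in proportion to the number of steps:
# the signature of diffusion.
walks = [random_walk(1000) for _ in range(2000)]
mean = sum(walks) / len(walks)
mean_sq = sum(x * x for x in walks) / len(walks)
print(mean, mean_sq)   # mean near 0, mean_sq near 1000
```

Each individual walker wanders irregularly, like a pollen grain, even though the statistical behavior of the whole collection is perfectly regular.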

As the molecules of the gas collide with one another and the walls of the container, their positions and velocities will be altered and hence the distribution function will change. It can be shown, mathematically, that if there are no external forces acting, this function tends toward a certain value, known as the Maxwell-Boltzmann distribution function, which is independent of both position and time. When this value is reached, there exists an overall balance between the numbers of particles gaining and losing a given change of velocity resulting from collisions. The gas is then said to be in a state of "thermodynamic equilibrium." Its macroscopic properties are constant, and the values of the pressure, temperature, and density are the same throughout its volume. Once the Maxwell-Boltzmann distribution function has been obtained, the equilibrium properties of the gas can be calculated. Hence, the thermodynamic properties of systems in equilibrium can be understood in microscopic terms.
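
One consequence of the Maxwell-Boltzmann distribution can be checked by direct sampling: in equilibrium each velocity component is Gaussian with variance kT/m, so the mean kinetic energy per molecule should come out close to (3/2)kT. The numerical values below are illustrative:

```python
import math
import random

random.seed(1)
k = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0          # K
m = 4.65e-26       # kg, roughly one nitrogen molecule
sigma = math.sqrt(k * T / m)   # std dev of each velocity component

# Sample velocities from the Maxwell-Boltzmann distribution and
# accumulate the kinetic energy of each molecule.
energies = []
for _ in range(20000):
    vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
    energies.append(0.5 * m * (vx**2 + vy**2 + vz**2))

mean_energy = sum(energies) / len(energies)
print(mean_energy / (1.5 * k * T))   # ratio close to 1.0
```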

It is observed that macroscopic systems, if left to themselves, pass irreversibly from a state of nonequilibrium to one of equilibrium. Heat will flow from a hot body to a cold one, for example, until the temperatures of the two become equal. This irreversible behavior is understood in terms of the concept of "entropy," which can be regarded as a measure of the quantity of energy unavailable for work. According to the second law of thermodynamics, the entropy of an isolated system always increases until it reaches its maximum value at equilibrium, when the distribution function takes on the Maxwell-Boltzmann form. It can be shown that this is the most probable distribution in the following sense: A system in a given macroscopic state can be in any one of a huge number of microscopic states and the "coarse-grained" description offered by the macroscopic variables cannot distinguish between these states. It cannot distinguish between the different microscopic states obtained by simply interchanging the energies of two particles, for example. The number of microscopic states corresponding to the state of equilibrium is overwhelmingly large compared with all other possible microscopic states, and hence this particular macroscopic state is more probable than any other. The probability of appreciable fluctuations away from this state is correspondingly extremely small.

More specifically, the entropy is proportional to the logarithm of the number of microstates composing a given macrostate, as specified by the set of macroscopic variables, where it is assumed that all microstates compatible with this specification of the state of the system are equally likely to occur. The number of microstates of which a given macrostate is composed is known as the "statistical weight" of that macrostate. As this number increases, so does the entropy, until it reaches a maximum. This occurs when the number of microstates corresponding to a given macrostate is greatest, that is, at equilibrium. Thus, irreversibility can be understood as a movement toward more and more probable states until the most probable macroscopic state of all--the state of equilibrium--is reached.
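
The counting argument can be made concrete with a small example. For N molecules distributed between the two halves of a box, the statistical weight of the macrostate "n molecules in the left half" is the binomial coefficient C(N, n), and the entropy (with Boltzmann's constant dropped) is its logarithm:

```python
import math

# Statistical weight of "n of N molecules in the left half": C(N, n).
N = 100
weights = [math.comb(N, n) for n in range(N + 1)]
entropies = [math.log(w) for w in weights]   # S = log W, constant dropped

most_probable = max(range(N + 1), key=lambda n: weights[n])
print(most_probable)   # 50: the even split has the most microstates
# The entropy rises toward the uniform macrostate and peaks there,
# which is exactly the equilibrium state described in the text.
```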

As a simple illustration of this result, consider a glass jar in which a layer of pepper sits on a layer of salt of equal depth. This initial configuration corresponds to a highly improbable distribution. If the jar is shaken vigorously, a gray mixture will result. If the shaking is continued, it is extremely unlikely that the original configuration will ever return. There are so many more arrangements of the salt and pepper particles that result in a gray mixture than there are that give the original configuration, that the gray state is overwhelmingly more probable. Thus, shaking the jar represents an irreversible process, in which the system evolves from less probable states to those that are vastly more probable.
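
The salt-and-pepper argument can be quantified. Treating the jar, for simplicity, as 100 slots holding 50 salt and 50 pepper grains, every arrangement is equally likely after thorough shaking, and only one of them is the original layered configuration:

```python
import math

# Number of distinct ways to choose which 50 of the 100 slots hold salt.
arrangements = math.comb(100, 50)

print(arrangements)       # about 1.0e29 equally likely arrangements
print(1 / arrangements)   # chance that one shake restores the layers
# Gray, well-mixed arrangements make up almost all of this total,
# which is why shaking is effectively irreversible.
```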

In evaluating the statistical weight of a certain macrostate, due consideration must be given to the number of different arrangements of the particles that are regarded as possible. In classical statistical mechanics, an exchange of energies of two indistinguishable particles is taken to give rise to a different arrangement, whereas in quantum statistical mechanics it is not, leading to a reduction in the statistical weight. Two forms of quantum statistical mechanics arise as a result: Fermi-Dirac statistics, according to which no more than one particle can occupy a given quantum state (this is the famous Pauli exclusion principle), and Bose-Einstein statistics, which apply to particles that tend to "condense" into the same quantum state. Fermi-Dirac statistics explain a wide range of phenomena, including the nature of the periodic table of elements, the properties of atomic nuclei, the formation of white dwarf stars, and the conduction properties of metals; Bose-Einstein statistics apply to electromagnetic radiation, considered in terms of photons, and explain the superfluid behavior of liquid helium 4.
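
The difference between the two quantum statistics shows up directly in the mean occupation number of a single-particle state. The sketch below uses the standard occupancy formulas with the chemical potential set to zero purely for illustration:

```python
import math

def occupancy(E_over_kT, sign):
    """Mean occupation of a state at energy E (in units of k*T).

    sign = +1 gives Fermi-Dirac statistics, sign = -1 gives
    Bose-Einstein statistics (chemical potential taken as 0 here).
    """
    return 1.0 / (math.exp(E_over_kT) + sign)

for x in (0.5, 1.0, 2.0):   # energy in units of k*T
    print(x, occupancy(x, +1), occupancy(x, -1), math.exp(-x))
# Fermi-Dirac occupation never exceeds 1 (the Pauli exclusion
# principle), while Bose-Einstein occupation exceeds the classical
# Boltzmann factor (last column): the tendency to share a state.
```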

Applications

Consider the example of a column of gas, at thermal equilibrium, extending to a great height. This can be viewed as a model of the atmosphere; it is highly idealized, since the atmosphere is not at equilibrium--it gets colder as the height increases. The following question can be asked: How does the density of this idealized atmosphere change with height? Since the temperature is constant, the number of molecules per unit volume, or particle density, is proportional to the pressure. The pressure decreases with increasing altitude and therefore so must the particle density. On this basis, it is easy to show that the density decreases exponentially with height. Furthermore, the rate of decrease is proportional to the mass of the molecules: the density of heavier gases decreases more rapidly with increasing altitude than the density of lighter gases. Although this is not strictly observed in the earth's atmosphere, because it is not at equilibrium and there are winds and atmospheric disturbances that mix the gases, there is a tendency for lighter gases, such as hydrogen, to become predominant at very high altitudes.
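
The exponential fall-off, n(h) = n(0) exp(-mgh/kT), is easy to evaluate. The sketch below compares nitrogen with hydrogen at an assumed uniform temperature (the idealization described above); the numbers are illustrative, not observed atmospheric data:

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
g = 9.81           # m/s^2
T = 273.0          # K, assumed uniform throughout the column

def relative_density(mass_kg, height_m):
    """n(h)/n(0) for an isothermal column of molecules of given mass."""
    return math.exp(-mass_kg * g * height_m / (k * T))

m_n2 = 4.65e-26    # kg, nitrogen molecule
m_h2 = 3.35e-27    # kg, hydrogen molecule

for h in (0, 8000, 16000):   # heights in meters
    print(h, relative_density(m_n2, h), relative_density(m_h2, h))
# Heavier nitrogen thins out much faster than hydrogen, so the lighter
# gas tends to predominate at great heights, as the text notes.
```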

This exponential variation of density is an example of Boltzmann's law, which states that the probability of finding molecules in a given spatial arrangement varies exponentially with the negative of the potential energy of that arrangement divided by the product of the absolute temperature and a constant (known as Boltzmann's constant). This result explains a wide range of phenomena, such as evaporation and thermionic emission in cathode-ray tubes. More generally, the Boltzmann distribution gives the probability that a system in equilibrium at a given temperature is in a state with a certain energy as varying exponentially with the negative of the energy of that state, divided by the product of Boltzmann's constant and the temperature. This is one of the fundamental results of equilibrium statistical mechanics.
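
As a minimal illustration of the Boltzmann distribution, consider a hypothetical two-level system whose energy gap is about one kT at room temperature:

```python
import math

k = 1.380649e-23          # Boltzmann's constant, J/K
T = 300.0                 # K
levels = [0.0, 4.14e-21]  # J; the gap is roughly 1 k*T at 300 K

# Boltzmann weight exp(-E/(k*T)) for each level, normalized by the
# partition function Z so the probabilities sum to 1.
boltzmann = [math.exp(-E / (k * T)) for E in levels]
Z = sum(boltzmann)
probs = [w / Z for w in boltzmann]
print(probs)   # the lower level is more populated, by a factor ~e
```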

With regard to the approach to equilibrium, it is fairly straightforward to give more or less qualitative explanations of irreversible behavior. Take the example of a gas contained in a box. At equilibrium, the density will be uniform and there will be almost exactly half the total number of molecules in each half of the box. There will be fluctuations, where one half of the box contains slightly more molecules than the other half, but these will be small compared with the total number of molecules. The larger the fluctuation, the more improbable it is, since it will correspond to a macrostate for which there are fewer microstates than the number composing the equilibrium state. It is therefore extremely improbable that the gas should suddenly spontaneously change from its equilibrium state to one in which all the molecules are contained in one half of the box--so improbable, in fact, that the gas would have to be observed for a very long time, longer than the universe has been in existence, before this spontaneous change would be seen. Obviously, the gas can be specially prepared to be in this latter state: by compressing it into half of the box and inserting a partition, for example. This effects a change in the macroscopic variables of the system (by reducing the volume), and the molecules will take up the most probable configuration compatible with this new macroscopic state. When the partition is removed, however, the gas will very rapidly expand to fill the box. With the macroscopic variables returned to their original values, the configuration with all the molecules in one half of the box is now extremely improbable, and the gas will quickly pass through a sequence of ever more probable states until equilibrium is reached once more.
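
The improbability of such a fluctuation can be estimated directly: if each molecule is independently equally likely to be in either half of the box (an idealizing assumption), the probability that all N gather in one half is (1/2)^N:

```python
import math

# Base-10 logarithm of the probability that all N molecules
# spontaneously occupy the same half of the box.
for N in (10, 100, 1000):
    log10_p = -N * math.log10(2)
    print(N, log10_p)
# Already for N = 1000 the probability is of order 10**-301; for a
# macroscopic N ~ 10**23 the expected waiting time dwarfs the age of
# the universe, as the text observes.
```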

Nevertheless, a complete microscopic understanding of irreversibility must take into account the detailed dynamics of the processes involved and, in particular, the nature of the collisions between the particles. The simplest example is that of a dilute gas, where only binary collisions occur between the molecules. As these collisions redistribute the velocities of the molecules, the distribution function changes with time. The change in this function is given by the Boltzmann transport equation, which governs the three fundamental transport processes of diffusion, viscosity, and thermal conduction. It is based on a statistical assumption known as the assumption of "molecular chaos," which refers to the absence of any correlations--either in position or velocity--between the two particles involved in the collision.
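
The full Boltzmann transport equation is beyond a short sketch, but its qualitative effect, collisions driving the distribution function toward equilibrium, can be mimicked with the much simpler relaxation-time (BGK) approximation. Everything below (the single relaxation time, the numbers) is an illustrative stand-in, not the equation itself:

```python
# Relaxation-time picture: collisions push the distribution function f
# toward its equilibrium value f_eq at a rate 1/tau, i.e.
#   df/dt = (f_eq - f) / tau
# integrated here with a simple explicit Euler step.

f_eq = 1.0    # equilibrium value of the distribution (arbitrary units)
f = 0.2       # initial nonequilibrium value
tau = 2.0     # relaxation time
dt = 0.01     # time step

history = []
for _ in range(1000):
    f += dt * (f_eq - f) / tau
    history.append(f)

print(history[0], history[-1])   # monotonic approach toward f_eq = 1.0
```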

This is a plausible assumption to make in the case of a dilute gas because the particles are traveling freely most of the time, and it is very unlikely that a recollision will occur between the same two particles. Hence, particle correlations are negligible. For a dense gas, or liquid, however, a large proportion of the molecules will be undergoing collisions at any given time, and a number of these collisions will be recollisions, so the positions and velocities of the molecules may be correlated. In these cases, molecular chaos can no longer be assumed and the Boltzmann equation no longer holds. Although attempts have been made to generalize this equation by including a correlation function, most of the results that have been obtained have been restricted to certain special and highly idealized cases. The statistical mechanics of fluids is, in general, extremely complex and relies heavily on computer simulations.
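
As a taste of the computer simulations mentioned here, the sketch below performs one velocity-Verlet integration run for two particles interacting through a Lennard-Jones potential, in reduced units. It is a minimal illustration of the molecular-dynamics idea, not a production simulation:

```python
def lj_potential(r):
    """Lennard-Jones potential in reduced units (epsilon = sigma = 1)."""
    return 4.0 * (r**-12 - r**-6)

def lj_force(r):
    """Force on the right-hand particle; positive values push it rightward."""
    return 24.0 * (2.0 * r**-13 - r**-7)

# Two unit-mass particles on a line, started at rest at separation 1.5,
# where the Lennard-Jones force is attractive, so the pair oscillates.
x1, x2 = 0.0, 1.5
v1, v2 = 0.0, 0.0
dt = 0.001

def total_energy(x1, x2, v1, v2):
    return 0.5 * (v1**2 + v2**2) + lj_potential(x2 - x1)

e0 = total_energy(x1, x2, v1, v2)
f = lj_force(x2 - x1)
for _ in range(5000):
    # velocity-Verlet step: half kick, drift, new force, half kick
    v1 -= 0.5 * dt * f
    v2 += 0.5 * dt * f
    x1 += dt * v1
    x2 += dt * v2
    f = lj_force(x2 - x1)
    v1 -= 0.5 * dt * f
    v2 += 0.5 * dt * f

drift = abs(total_energy(x1, x2, v1, v2) - e0)
print(x2 - x1, drift)   # separation stays bounded; energy nearly conserved
```

Scaling the same scheme to thousands of particles in three dimensions, with statistical averages taken over the trajectories, is essentially how the fluid simulations referred to above are performed.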

Context

The development of statistical mechanics can be traced back to the work of James Clerk Maxwell and Ludwig Boltzmann. Maxwell demonstrated that the effect of collisions was to produce a statistical distribution of particle velocities in which all velocities would occur with a known probability. This is given by the velocity distribution function, first introduced in 1859.

In a memoir published in 1866, Maxwell showed that the fundamental linear transport processes in a gas, involving diffusion, heat conduction, and viscosity, could all be conceptualized as special cases of a generalized transport process, in which a physical quantity is carried by molecular motion and transferred from one molecule to another by collisions.

The Boltzmann transport equation, obtained by Boltzmann in 1872, is a special case of Maxwell's result, in which the physical quantity of interest is the velocity distribution function.

The first definitive solution of this equation was obtained in 1917 by D. Enskog, although a similar method of solution for Maxwell's original results was obtained one year earlier by Sydney Chapman. The Chapman-Enskog method, as it is now known, establishes a connection between the microscopic behavior of particles and the hydrodynamic behavior of dilute gases.

Much of modern research in this field concentrates on obtaining solutions to generalized forms of the Boltzmann equation for nondilute fluids.

In 1872, in an application of his own equation, Boltzmann discussed the behavior of the entropy function in irreversible processes by considering the way in which the velocity distribution changes with time because of interparticle collisions. By considering the details of collision processes, he was able to deduce that a certain quantity, known as the "H-function," must always decrease or remain constant. This function was then shown to be proportional to the negative of the entropy in the equilibrium state, thus providing a molecular interpretation of the second law of thermodynamics and, consequently, of irreversible phenomena.

Nevertheless, criticism of this result, in terms of the impossibility of explaining irreversibility solely in terms of time-reversible Newtonian mechanics, forced Boltzmann to acknowledge the statistical nature of the second law of thermodynamics. In 1877, he considered the distribution of particles over all possible energy states and was able to show not only that the equilibrium distribution (as given by the Maxwell-Boltzmann distribution function) was overwhelmingly the most probable one but also that the second law could be understood as the statement that systems tend to evolve from less probable states to more probable ones.

Boltzmann's combinatorial approach was subsequently used by Max Planck to explain the distribution of energy in electromagnetic radiation at thermal equilibrium. The division of the energy into discrete "quanta" involved in this explanation led to the development of quantum theory.

The consideration of collections, or "ensembles," of particles was further developed and generalized by Josiah Willard Gibbs, who coined the term "statistical mechanics" in 1901.

Boltzmann's approach of 1877 effectively introduced the "microcanonical" ensemble, involving systems of constant energy. Gibbs went on to discuss the "canonical" ensemble, in which the temperature is kept constant but the energy fluctuates, with these fluctuations being, in general, extremely small, and the "grand canonical" ensemble, in which the temperature is kept constant but the number of particles in the system is not.

Despite the power of this approach and the clear exposition given by Paul and Tatiana Ehrenfest, who demonstrated the connection between Boltzmann's work on the H-function and the combinatorial approach involving the microcanonical ensemble, a number of fundamental questions remain. These questions have centered on the assumption of molecular chaos, invoked by Boltzmann to explain irreversible phenomena. Certain modern approaches to irreversibility avoid probability arguments and attempt to derive irreversibility on the basis of a consideration of the dynamic instability of systems of many particles. The central idea here is that two such systems, whose initial states are very similar--in fact, arbitrarily so--may evolve with time into states that are very different. This idea is the basis of the theory of "chaos," which has been developed to explain a wide range of phenomena, from atmospheric turbulence to the beating of the human heart. Although much attention has been given to this kind of approach, a number of fundamental problems remain. In general, it can be said that although the problems of equilibrium statistical mechanics are essentially mathematical in nature, those of the nonequilibrium theory are ones of principle, involving the very foundations of the subject.

Principal terms

DISTRIBUTION FUNCTION: gives the probable number of particles having positions and velocities (or more generally, energies) within arbitrarily small ranges

ENTROPY: a measure of the energy of a system that is unavailable for work

EQUILIBRIUM: the state in which the entropy is a maximum

IRREVERSIBLE PROCESS: a process for which the total entropy increases

MAXWELL-BOLTZMANN DISTRIBUTION: the form adopted by the distribution function when the system reaches equilibrium

MOLECULAR CHAOS: a fundamental assumption that asserts the absence of correlations between the particles of the system

STATISTICAL WEIGHT: the number of microstates encompassing a given macroscopic state

TRANSPORT PROCESSES: processes that take place when a fluid (gas or liquid) is in a nonuniform state and some physical property--such as mass, momentum, or thermal energy--is transported, by collisions, through the fluid

Bibliography

Brush, Stephen G. THE KIND OF MOTION WE CALL HEAT: A HISTORY OF THE KINETIC THEORY OF GASES IN THE NINETEENTH CENTURY. 2 vols. New York: Elsevier, 1976. A comprehensive account of the history of kinetic theory and statistical mechanics up to the beginning of the twentieth century. The discussions of Maxwell's and Boltzmann's work are particularly illuminating.

Cohen, E. G. D. "The Kinetic Theory of Fluids: An Introduction." PHYSICS TODAY 37 (January, 1984): 64-73. A useful survey of attempts to explain the hydrodynamic behavior of dense fluids in statistical mechanical terms. The contrast with dilute fluids is very well drawn and the usefulness of computer simulations is emphasized.

Feynman, Richard P., Robert B. Leighton, and Matthew Sands. THE FEYNMAN LECTURES ON PHYSICS. Vol. 1. Reading, Mass.: Addison-Wesley, 1963. This classic set of lectures contains a useful introduction to the principles of statistical mechanics, with an emphasis on applications. There is, however, little discussion of irreversibility from a statistical mechanical as opposed to a thermodynamical point of view.

Mandl, Franz. STATISTICAL PHYSICS. New York: John Wiley & Sons, 1971. This is an excellent exposition of the fundamental principles of statistical physics and includes a number of interesting and useful examples of their application. The discussion of quantum statistical mechanics is particularly clear.

Prigogine, Ilya. FROM BEING TO BECOMING: TIME AND COMPLEXITY IN THE PHYSICAL SCIENCES. San Francisco: W. H. Freeman, 1980. Written by a Nobel Prize winner, this is an introductory account of a novel approach to nonequilibrium statistical mechanics that emphasizes the role of fluctuations and the open nature of self-organizing systems.

The Behavior of Gases

Laws of Thermodynamics

Essay by Steven R. D. French