Measurement and Units
Measurement refers to the process of quantifying physical properties, enabling effective communication and understanding of various dimensions and quantities. Seven fundamental properties are recognized: length, mass, time, electric current, thermodynamic temperature, amount of a substance, and luminous intensity. Each of these properties has a corresponding base unit, and derived units can be created by combining these fundamental units for specific applications, such as Newtons for force or square meters for area.
Throughout history, various cultures have developed distinct measurement systems, with the metric system being widely adopted for its standardization and reproducibility. The principles of measurement are rooted in the need for consistency and clarity, allowing individuals to share and comprehend information regarding physical properties accurately. As technology continues to advance, the definitions and applications of measurement units evolve, leading to greater precision and accuracy in fields like science, engineering, and trade. Understanding measurement and its units is essential across numerous industries, reflecting its fundamental role in human activity and economic exchange.
Summary
Measurement involves quantifying a physical property, an effect, or some aspect of it. Seven fundamental properties are recognized in measurement—length, mass, time, electric current, thermodynamic temperature, amount of a substance, and luminous intensity. In addition, two supplementary or abstract fundamental properties are defined—plane and solid angles. The base units for the seven fundamental properties can be manipulated to produce derived units for other quantities that are the effect of combinations of these properties. For instance, a Newton is a derived unit measuring force and weight, and a square meter is a derived unit used to measure area.
Historically, many units representing different amounts of the same properties have been used in various cultures. For example, the United States traditionally uses the US customary system (e.g., miles, cups, pints, and ounces). Most other industrialized nations use the metric system (e.g., kilometers, milliliters, liters, and grams). The metric system, developed in France in the late eighteenth century, represents the first true standard measurement system. The theory and physical practice of measurement is constant no matter what system of units is being used.
Definition and Basic Principles
Measurement has the purpose of associating a dimension or quantity proportionately with some fixed reference standard. Such an association is intended to facilitate the communication of information about a physical property in a manner that allows it to be reproduced in concept or actuality if needed. The function of an assigned unit associated with a definite dimension is to provide the necessary point of reference for someone to comprehend the exact dimensions that have been communicated. For example, a container may be described as having a volume of eight cubic feet. Such a description is incomplete, however, because it does not state the shape or relative proportions of the container. The description applies equally well to containers of any shape, whether cubic, rectangular, cylindrical, conical, or another shape. At a more basic level, there is the assumption that the person who receives the description has the same understanding of what is meant by “cubic feet” as the person who provided the description. This is the fundamental principle of any measurement system—to provide a commonly understood frame of reference that indicates the same thing to each party.

A measurement system, no matter what its basic units, must address a limited group of fundamental properties: length, mass, temperature, time, electric current, amount of a substance, and luminous intensity. In addition, it must also be able to describe angles. All other properties and quantities can be described or quantified by a combination of these fundamental properties.
In practice, these fundamental properties can be defined relative to any randomly selected relevant object or effect. Logically, though, for a measurement system to be as effective as possible, the objects and effects selected as the defined units of fundamental properties must be readily available to as many people as possible and readily reproducible. If, for example, a certain king were to decree that the “foot” to be used for measurement was the human foot, a great deal of confusion would result because of the variability in the size of the human foot from person to person and from a person's left foot to the right foot. Should he instead decree that the “foot” would correspond to his right foot and no other, the unit of measurement becomes significantly more precise. At the same time, the decree raises the problem of how to verify the measurements taken are based on the decreed length of the foot. A physical model of the length of the king's right foot must then be made available for comparison. The same logic holds for all defined properties and units. All measurements made within the definitions of a specific measurement system are therefore made by comparing the proportional size of an effect or property to the defined standard units.
Background and History
Historically, measurement systems were generally based on various parts of the human body, and some of these units have remained in use in the modern world. The height of horses, for example, is generally given as being so many hands high. In many languages, the word for “thumb” and the word for “inch” are closely related, if not identical. In French, both words are pouce; in Hungarian, hüvelyk and hüvelykujj; in Norwegian, tomme and tommelfinger; and in Swedish, tum and tumme. Although using the thumb as a basis for measurement is convenient in that almost everyone has one, in practice, the generally accepted thumb size tended to vary from city to city, making it impossible to interchange parts made in different locations by separate artisans.
Units of measurement traditionally have been defined by a decree issued by a political leader. In ancient times, units of length often corresponded to certain parts of the human body. The foot was based on the human foot, and the cubit was based on the length of the forearm. Other units used to weigh various goods were based on the weight of commonly available items. Examples include the stone (used widely but not officially in Britain) and the grain. Invariably, the problem with such units lies in their variability. A stone's weight may be defined as equivalent to the size of a stone that a grown man could enclose by both arms, but grown men come in different sizes and strengths, and stones come in different densities. A grain of gold may be defined as equivalent to the weight of a single grain of wheat. In drier years, when grains of wheat are smaller and lighter, the worth of gold is significantly different from its worth in years of good rainfall, when grains of wheat are larger and heavier.
Measurements and units are most useful when they are standardized so that measurements mean the same thing to everyone and are comparable. One method of solving the problem of variability in measurement is to establish a standard value for each unit and regularly compare all measuring devices to this standard.
Historically, many societies required measuring devices be physically compared with and calibrated against official standards. Physical representations of measuring units were made as precisely as possible and carefully stored and maintained to serve as standards for comparison. In ancient Egypt, the standard royal cubit was prepared as a black granite rod and most likely kept as one of the royal treasures by the pharaoh's chief steward. That simple stone object would have been accorded such a status because of its economic value to trade and construction and because it helped maintain the pharaoh's reputation as the keeper of his kingdom.
Later nations and empires also kept physical representations of most standard units appropriate to trade economics. The British made and kept definitive representations of the yard and the pound, just as the French made and kept definitive representations of the meter and the kilogram after developing the metric system in the late 1700s. Early on, countries recognized that unit representatives must not be subject to change or alteration. The Egyptian standard royal cubit was made of black granite. The standard meter and the kilogram, as well as the foot and the pound, were made from a platinum alloy so they could not be altered by corrosion or oxidation.
France adopted the metric system as its official measuring system in 1795, and that system was standardized in 1960 as the International System of Units (Le Système International d'Unités, known as SI). The units of the metric system were defined based on unchanging, readily reproducible physical properties and effects rather than any physical object. For example, the standard SI unit of length, the meter, was originally defined as one ten-millionth of the distance from the equator to the North Pole along the meridian of longitude that passed through Paris, France. In 1960, for greater accuracy, the definition of the meter was based on a wavelength of light emitted by the krypton-86 isotope. In 1983, it was changed to the distance traveled by light in a vacuum during 1/299,792,458 of a second. Similarly, the length of a standard second had been defined as a fraction of one rotation of the Earth on its axis, until it was realized that the rate of rotation was not constant but rather was slowly decreasing, so that the length of a day is increasing by about 0.0013 seconds per hundred years. In 1967, the second was formally redefined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom. As technology develops, it becomes possible to measure smaller quantities with finer precision, and this capability is what permits ever more precise definitions of the basic units of measurement.
The General Conference on Weights and Measures (Conférence Générale des Poids et Mesures, or CGPM) governs the International Bureau of Weights and Measures (BIPM) and oversees changes to the International System of Units (SI). The CGPM meets every three to six years to evaluate the framework on which international measurements are based and ensure the validity and reliability of global scientific research. In the 2010s, the Kelvin scale was reevaluated and clarified, and a proposal was advanced to redefine the kilogram, a base unit of the SI, in terms of the Planck constant (h). When the CGPM approved these changes in 2018 (they took effect in 2019), the mole was also redefined based on the Avogadro constant, and the kelvin was redefined based on the Boltzmann constant. The ampere was redefined by fixing the elementary charge at exactly 1.602176634 × 10⁻¹⁹ coulombs. In 2022, four new metric prefixes were introduced.
The application of measuring procedures is of fundamental importance in the economics of trade, especially in the modern global economy. In manufacturing, engineering, the sciences, and other fields, the accuracy and precision of measurement are essential to statistical process control and other quality-control techniques. All such measurement is a process of comparing an actual item's dimensions or properties with its ideal or design dimensions or properties. The definition of a standard set of measuring units greatly facilitates that process.
Applications and Products
It is quite impossible to calculate the economic effects that various systems of measurement, both good and bad, have had throughout history. Certainly, commonly understood and accepted units of weight, distance, and time have played a major role in facilitating trade between peoples for thousands of years. In many ways, all human activity can be thought of as dependent on measurement.
The study of measurement and measurement processes is known as metrology. In essence, metrology is the determination and application of more precise and effective means of measuring quantities, properties, and effects. The value in metrology derives from how the obtained information is used. This is historically and traditionally tied to the concepts of fair trade and well-made products. Measurement ensures that trade is equitable, that people get exactly what they are supposed to get in exchange for their money, services, or other trade goods, and that as little goes to waste as possible.
By far, the largest segment of metrology deals with the design, production, and calibration of the various products and devices used to perform measurements. These devices range from the simplest spring scale or pan balance to some of the most sophisticated and specialized scientific instruments ever developed. In early times, measurements were restricted to mass, distance, and time because these were the foremost quantities used in trade. The remaining fundamental properties of electric current, amount of a substance, temperature, and luminous intensity either remained unknown or were not of consequence.
Weight Determination. Originally, weights were determined with relative ease by using the pan balance. In the simplest variation of this device, two pans are suspended from opposite ends of a bar in a way that they are at equal heights. The object to be weighed is placed in one of the pans, and objects of known weight are placed in the other pan until the two pans are again at equal heights. The method’s precision depends on the bar’s ability to pivot as freely as possible about its balance point. Any resistance will skew the measurement by preventing the pans from coming into proper balance with each other.
Essentially, all balances operate on the principle of comparing the weight of an unknown object against the weight of an accurately known counterweight, or against some property, such as electric current, that can be measured very accurately and precisely. The counterweight need not be an actual weight; it may instead be an electronic pressure sensor or another device that indicates the weight of an object, such as a calibrated spring under tension or a change in electrical resistance. Scales used in commercial applications, such as those in grocery stores, grain depots, and other trade locations, are inspected and calibrated regularly according to laws and regulations that govern their use.
Length Determination. For many practical purposes, a linear device such as a scale ruler or tape measure is all that is needed to measure an unknown length. The device used should reflect the size of the object being measured and have scale markings that reflect the precision with which the measurement must be known. For example, a relatively small dimension being measured in a machine shop would be measured by a trained machinist against a precision steel scale ruler with dimensional markings of high precision. Training in the use of graduated scale markings typically enables the user to read them accurately to within one-tenth of the smallest division on the scale. High-precision micrometers are generally used to make more precise measurements of smaller dimensions, and electronic versions of such devices provide the ability to measure dimensions to extremely precise tolerances. Smaller dimensions for which the accuracy of the human eye is neither sufficient nor sufficiently reproducible are measured using microscope techniques and devices. Accurate measurements of larger dimensions have always been problematic, especially when the allowable tolerance of the measurement is very small. This has been overcome in many cases by the measuring machine, a semi-robotic device that uses electronic control and logic programming to determine the distance between precise points on a specific object.
The 1960 definition of the meter was achieved using a precision interferometer, a device that uses the interference pattern of light waves so that the number of wavelengths of a specific frequency of light can be counted. Using this device, the meter was precisely defined as the distance equal to 1,650,763.73 wavelengths of the 2p₁₀–5d₅ emission of krypton-86 atoms. In 1983, however, the meter was redefined as the distance traveled by light in a vacuum in 1/299,792,458 of a second.
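The arithmetic implicit in these two definitions can be checked directly. The following is a minimal Python sketch using the defined figures (the variable names are ours):

```python
# 1960 definition: 1 m = 1,650,763.73 wavelengths of a krypton-86 emission line.
wavelengths_per_meter = 1_650_763.73
krypton_wavelength_m = 1 / wavelengths_per_meter
print(f"Krypton-86 line: {krypton_wavelength_m * 1e9:.2f} nm")  # 605.78 nm

# 1983 definition: 1 m = distance light travels in a vacuum in 1/299,792,458 s.
c = 299_792_458  # speed of light in m/s, exact by definition
meter = c * (1 / 299_792_458)
print(f"Light-travel distance: {meter} m")  # 1.0 m
```

The orange-red krypton line at about 605.78 nanometers falls in the visible spectrum, which is what made it practical to count wavelengths with an interferometer.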
Time Measurement. The basic unit of time measurement in all systems is the second. This is a natural consequence of the length of a day being the same everywhere on the planet. Observation of the stars and their motions led almost inevitably to dividing that period into twenty-four equal parts. By a convenient coincidence, a pendulum approximately one meter in length completes a half-swing in very nearly one second; this "seconds pendulum" was once proposed as the basis for defining the meter. Pendulum clockworks have been used to measure the passage of time, coordinated to the natural divisions of the day, since the seventeenth century. More precise time measurements have become possible as better technology has become available. With the development of the metric system, the unit duration of one second was defined to correspond to the appropriate fraction of one rotation of the planet. Until the development of electronic methods and devices, this was a sufficient definition. However, with the realization that the planet's rotation is not constant but slowly decreasing, the need to redefine the second in terms that remain constant in time led to its redefinition in 1967. According to the new definition, a second is the duration of 9,192,631,770 periods of the radiation emitted in the transition between the two hyperfine levels of the ground state of the cesium-133 atom.
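The period of a simple pendulum follows the standard small-angle formula T = 2π√(L/g). A quick Python check (assuming standard gravity) shows that a one-meter pendulum completes a full swing in about two seconds, so each half-swing takes very nearly one second:

```python
import math

g = 9.80665  # standard gravity, m/s^2
L = 1.0      # pendulum length, m

# Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g)
T = 2 * math.pi * math.sqrt(L / g)
print(f"Full period: {T:.3f} s")      # 2.006 s
print(f"Half-swing:  {T / 2:.3f} s")  # 1.003 s
```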
Temperature Determination. Of all properties, the measurement of thermodynamic temperature is perhaps the most relative and arbitrary. The thermometer was developed before any of the commonly used temperature scales, including the Fahrenheit, Celsius, Kelvin, and Rankine scales. All four derive their degree sizes from the freezing and boiling points of water, the most readily available and ubiquitous substance on the planet. The Fahrenheit scale arbitrarily set the freezing point of saturated salt water at 0 degrees; the physical dimensions of the scale used on Fahrenheit's thermometer resulted in the boiling point of pure water falling at 212 degrees. The Celsius scale designated the freezing point of pure water as 0 degrees and the boiling point of pure water as 100 degrees. The sizes of the Fahrenheit and Celsius degrees are thus related by a ratio of 5 to 9. Conversion of a Fahrenheit temperature to Celsius is achieved by subtracting 32 degrees and multiplying the result by 5/9; to convert from Celsius to Fahrenheit, first multiply by 9/5, then add 32 degrees. A temperature of -40 degrees is the same in both scales.
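These conversion rules can be sketched in a few lines of Python (a minimal illustration; the function names are ours):

```python
def f_to_c(f):
    """Fahrenheit to Celsius: subtract 32, then multiply by 5/9."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Celsius to Fahrenheit: multiply by 9/5, then add 32."""
    return c * 9 / 5 + 32

print(f_to_c(212))   # 100.0 (boiling point of pure water)
print(c_to_f(0))     # 32.0  (freezing point of pure water)
print(f_to_c(-40))   # -40.0 (the two scales coincide)
```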
Each temperature scale recognizes a physical state called absolute zero, at which matter contains no thermal energy whatsoever. This must physically be in the same state regardless of whether Fahrenheit or Celsius degrees are being used. Because of this, two other scales of temperature were developed. The Rankine scale, established as part of the school of British engineering, uses the Fahrenheit degree scale, beginning at 0 degrees at absolute zero. The Kelvin scale uses the Celsius degree scale, beginning at 0 degrees at absolute zero.
Temperature can also be measured accurately by electronic sensors and by detecting the infrared radiation that objects emit, although less accurate physical thermometers remain in wide use.
Amount of a Substance. Of all the fundamental properties, the amount of a substance has historically been the least precisely known. The concept is intimately linked to modern atomic theory and atomic weights. Through studies of the properties of gases in the early 1800s, the Italian scientist Amedeo Avogadro concluded that a quantity of any pure material equal to its molecular weight in grams contains exactly the same number of particles. This number of particles came to be referred to as the Avogadro constant. Thus, 2 grams of hydrogen gas (molecular weight 2) contains exactly the same number of molecules as 342 grams of sucrose (common white table sugar, molecular weight 342) or 18 milliliters of water (molecular weight 18, density 1 gram per milliliter). Measurements determined the value of the Avogadro constant to be about 6.02214 × 10²³; because the constant could be measured only approximately, the amount of a substance long remained the least precisely known unit of measurement. The 2019 revision of the SI resolved this by fixing the constant at exactly 6.02214076 × 10²³ per mole.
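The equal-particle-count claim follows from simple arithmetic: the number of particles in a sample is the mass divided by the molar mass, multiplied by the Avogadro constant. A minimal Python sketch (the helper name `molecules` is ours; the molar masses are the rounded values used above):

```python
AVOGADRO = 6.02214076e23  # particles per mole, exact since the 2019 SI revision

def molecules(mass_g, molar_mass_g):
    """Number of particles in a sample: (mass / molar mass) * Avogadro constant."""
    return (mass_g / molar_mass_g) * AVOGADRO

# Each sample below is exactly one mole, so each contains the same number of particles.
print(molecules(2, 2))      # 2 g of hydrogen gas
print(molecules(342, 342))  # 342 g of sucrose
print(molecules(18, 18))    # 18 g (18 mL) of water
```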
The amount of material containing an Avogadro constant's worth of particles (atoms or molecules) is termed the "mole," a contraction of "gram molecular weight." Because the constant is tied to the gram, a gram molecular weight of any substance contains the same number of molecules as a gram molecular weight of any other substance; a pound molecular weight likewise contains its own fixed number of molecules, larger by the ratio of the pound to the gram.
Electric Current. Of course, atoms are not indivisible. They consist of a nucleus of protons and neutrons surrounded by a cloud of electrons. The electrons can move from atom to atom through matter, constituting an electric current. More generally, any movement of electric charge between two different points in space defines an electric current. Although some controversial evidence suggests that electricity may have been known in ancient times, serious study of electricity began in the eighteenth century, after the discovery of the electrochemical cell. At that time, electricity was considered a mysterious fluid that permeated matter. With the discovery of the electron in 1897 and the subsequent development of modern atomic theory, the nature of electric currents came to be better understood. The ampere, named after André-Marie Ampère, one of the foremost investigators of electrical phenomena, is the basic unit of electric current and corresponds to the flow of one coulomb of charge (about 6.24 × 10¹⁸ electrons) past a point in one second.
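Since the SI fixes the elementary charge exactly, the number of electrons flowing per second in a one-ampere current can be computed directly (a minimal sketch; the variable names are ours):

```python
e = 1.602176634e-19  # elementary charge in coulombs, exact by definition

# One ampere is one coulomb per second, so the electron count per second
# is simply the reciprocal of the elementary charge.
electrons_per_second = 1 / e
print(f"{electrons_per_second:.3e} electrons per second")  # about 6.242e+18
```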
The development of the transistor—and the electronic revolution that followed—made it possible to measure extremely small electric currents of as little as 10⁻⁹ amperes, as well as corresponding values of voltage, resistance, induction, and other electronic functions. This made it possible to precisely measure fundamental properties by electronic means rather than physical methods.
Measurement of Luminous Intensity. Until it became important and necessary to know precisely the intensity of light being emitted from a light source, particularly in astronomy and physics, there was no need for a fundamental unit of luminous intensity. Light intensities were generally compared, at least in post-Industrial Revolution Europe, to the intensity of light emitted from a candle. This sufficed for general uses such as lightbulbs, but the innate variability of candle flames made them inadequate for precision measurements. The candela was set as the standard unit of luminous intensity; it corresponds to a source emitting light at a frequency of 540 terahertz with a radiant intensity of 1/683 watt (about 0.00146 watt) per steradian. Most modern lightbulbs are rated at a certain number of watts, but this is a measure of the electric power they consume and not their luminous intensity.
Careers and Course Work
Measurement and units, as they apply to metrology, are part of every technical and scientific field. The student planning a career in these areas can expect to learn how to use the metric system and the many ways measurement is applied in a specific field of study. Because measurement is related to the fundamental properties of matter, a good understanding of the relationships between measurements and properties will be essential to success in any chosen field. In addition, the continuing use of multiple measuring systems, such as the metric, US customary, and British Imperial systems, in the production of goods means that understanding those systems and how to convert between them will be necessary for many careers.
Specific courses that involve measurement include geometry and mathematics, essentially all physical science and technology courses, and design and technical drawing courses. Geometry, which means “earth measurement,” is the quintessential mathematics of measurement and is essential for land surveying, architecture, agriculture, civil engineering, and construction careers. Gaining an understanding of trigonometry and angular relationships is particularly important. The physical sciences, such as physics, chemistry, and geology, employ analytical measurement at all levels. Specializations in which measurement plays a prominent role are analytic chemistry and forensic research. Technology programs such as mechanical engineering, electronic design, and biomedical technology heavily rely on measurement and the application of metrological techniques in the completion of projects that the engineer or technologist undertakes.
Social Context and Future Prospects
Measurement and understanding the units of measurement are an entrenched aspect of modern society, taught informally to children from early childhood and formally throughout their schooling. They are fundamental to the continued progress of technology and essential to the determination of solutions to problems as they arise and the development of new ideas and concepts. The need for individuals trained in metrology who understand the relationship and use of measurements and units will be increasingly important in ensuring the viability of new and established industries.
Particular areas of growth and continuing development include medical research and analysis, transportation, mechanical design, and aerospace. The accuracy and precision of measurement in these fields are critical to the successful outcome of projects.
Bibliography
Butcher, Kenneth S., Linda D. Crown, and Elizabeth J. Gentry. The International System of Units (SI): Conversion Factors for General Use. National Institute of Standards and Technology, 2006.
Cardarelli, Francois. Encyclopedia of Scientific Units, Weights and Measures: Their SI Equivalences and Origins. Springer, 2003.
Kuhn, Karl F. In Quest of the Universe. 8th ed. Jones and Bartlett, 2020.
Shultz, Kenneth S., et al. Measurement Theory in Action: Case Studies and Exercises. 3rd ed. Routledge, 2021.
"SI Redefinition." National Institute of Standards and Technology, 22 Feb. 2023, www.nist.gov/si-redefinition. Accessed 20 May 2024.
Tavernor, Robert. Smoot's Ear: The Measure of Humanity. Yale University Press, 2007.
"27th Meeting of the CGPM." Bureau International des Poids et Mesures, 2022, www.bipm.org/en/cgpm-2022. Accessed 20 May 2024.