Mathematics and quality control
Summary: Industrial production and processes can be studied mathematically to help ensure quality.
Statistical quality control, or more broadly, quality assurance, seeks to improve and stabilize the production and delivery of goods and services. A central concern of quality control is the testing and reporting of measurements of quality—typically as part of a monitoring process—to ensure that the quality of the item being studied meets certain standards.
Quality standards are determined by those who produce the goods or services. Some standards are specification limits imposed by engineering or design concerns that define conformance to a standard. For example, in making airplane engines, a certain part may need to have a diameter between 12 and 14 millimeters or it will not fit into a housing. However, for many processes, there are no specification limits, and quality standards may be defined internally from data on past behavior of a process that is judged to be “in control” or “stable.” For example, in examining the safety of a large production line, it may be that in each week of the last five years, the average number of person-hours lost to accidents has been 1.3. There is no specification limit for this quantity, but control limits can be based on this historical average.
Before a process can be analyzed effectively for statistical quality control, it must first be declared to be “in control.” To be in statistical control, the vast majority of the products or services must be of sufficient quality for the producers to be satisfied. Moreover, the process must be stable: the mean and variance of the quality measurements must be roughly constant. If a process is in control, then statistical analysis can provide meaningful control limits for monitoring it. Graphical methods play a significant role in statistical quality control.
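As a sketch of how control limits might be derived from historical in-control data, the short Python example below uses the standard individuals-chart estimate of σ from the mean moving range, σ̂ = MR̄/1.128 (1.128 is the usual d₂ constant for moving ranges of size two). The weekly person-hours figures are made up to average near the 1.3 mentioned above.

```python
def control_limits(history):
    """Estimate individuals-chart control limits from in-control historical data.

    Sigma is estimated from the mean moving range (the average absolute
    difference between consecutive observations) divided by d2 = 1.128.
    Returns (lower control limit, center line, upper control limit).
    """
    n = len(history)
    center = sum(history) / n
    mrbar = sum(abs(history[i] - history[i - 1]) for i in range(1, n)) / (n - 1)
    sigma_hat = mrbar / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

# Hypothetical weekly person-hours lost to accidents, averaging near 1.3
weeks = [1.2, 1.4, 1.3, 1.1, 1.5, 1.3, 1.2, 1.4]
lcl, center, ucl = control_limits(weeks)
```

Future weekly values above `ucl` (or below `lcl`) would then signal that the process may no longer be in control.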
History
Some measure of quality control was in evidence during the building of the Great Pyramids of Egypt. Archeologists have long been impressed not only with the complexity of the construction process but also by its precision. In the Middle Ages, medieval guilds were formed, in part, to ensure some level of quality of goods and services. The use of statistical methods in quality control—also called “statistical process control” (SPC)—is more recent, with most of the development occurring in the twentieth century. Graphical methods for quality control were introduced in a series of memos and papers in the 1920s by Walter A. Shewhart of Bell Telephone Laboratories. The charts he developed and promoted are known today as “Shewhart control charts.” H. F. Dodge and H. G. Romig, also of Bell Laboratories, applied statistical theory to sampling inspection, defining rules for the acceptance or rejection of production lots. Joseph M. Juran, whose focus was more on quality management than on SPC, was another early quality pioneer at Bell Laboratories and later Western Electric.
W. Edwards Deming applied SPC to manufacturing during World War II and was instrumental in introducing these methods to Japanese industry after the war ended. He and Juran are generally credited with helping Japanese manufacturing shed the negative image that “made in Japan” had in the 1950s and with transforming the country into a source of high-quality goods consumed all over the world. In the early twenty-first century, quality control issues continue to appear in the media as concerns proliferate over the quality of goods produced in China.
Common-Cause and Special-Cause Variation
In their 1939 book Statistical Method from the Viewpoint of Quality Control, Shewhart and Deming defined two types of variation that occur in all manufacturing and service processes. A certain amount of variation is part of every process and can be tolerated even when the goal is to produce goods and services of high quality. This variation is called “common-cause variation,” and it comprises all the natural variation in the process. The second type, called “special-cause variation,” is unusual and is not part of the natural variation. Special-cause variation needs to be detected as soon as possible. Quality control charts are designed to detect special-cause variation and distinguish it from common-cause variation.
Quality Control Charts
A quality control chart plots a summary of the quality measurements from each item (or a sample) in sequence against the sample number (or time). A center line is drawn at the mean, or at the desired center of the statistic. Upper and lower control limits are drawn to indicate the thresholds beyond which a measurement signals an “out of control” condition. Sometimes various warning lines are drawn as well, and a variety of rules are available for deciding whether a measurement is really out of control. The simplest chart, called an “individuals chart” (or “run chart”), plots a single measurement for each item. The control limits are based on the Normal probability model, which implies that for a process in control, only 0.27% of the observations will lie more than three standard deviations (σ) from the center. Therefore, if the process stays in control, a false alarm will occur on average only about once every 1/0.0027, or 370.4, observations. The central idea of a control chart is that a special cause will shift the mean (or increase the standard deviation), so the measurement will fall outside the 3σ limits with higher probability. If the shift is great enough, the time to detection will be very short. However, if the special cause produces only a subtle shift, it may take many observations before a signal is detected. Various other types of charts are available that generally perform better in terms of both false alarm rates and failure to detect shifts.
Total Quality Management and Philosophy
The ideas of Deming, Juran, Shewhart, and others have inspired numerous other people and quality movements. One such movement is total quality management (TQM), also known as “total quality” and “continuous quality improvement.” As the name implies, this approach to quality involves more than the monitoring of manufacturing or service processes. It includes all parts of the organization and, specifically, the role of management in helping to ensure that, in providing goods or services, “all things are done right the first time.” Implementing these ideas throughout a large organization gave rise to an abundance of books, experts, and quality “gurus” in the latter part of the twentieth century. One approach to total quality focuses on reducing variation (decreasing σ). If the common-cause variation can be reduced enough while the process remains in control, essentially no measurements will fall outside the 3σ limits. This notion is the essential idea behind the 6σ approach, first popularized by Motorola in the 1980s and later adopted by the General Electric Company in the 1990s. By the late 1990s, a majority of the Fortune 500 companies were using some form of the 6σ approach.
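A short sketch of the arithmetic behind reducing σ: under the Normal model, the fraction of output falling outside fixed limits drops extremely fast as those limits move from 3σ to 6σ away from the mean. (This simple calculation ignores the 1.5σ mean-shift allowance often quoted in Six Sigma literature, which yields the familiar 3.4 defects-per-million figure.)

```python
import math

def fraction_outside(k):
    """Fraction of a Normal process falling outside +/- k standard deviations."""
    phi_k = 0.5 * (1.0 + math.erf(k / math.sqrt(2.0)))  # standard Normal CDF at k
    return 2.0 * (1.0 - phi_k)

# Halving the process sigma turns 3-sigma limits into 6-sigma limits:
at_3_sigma = fraction_outside(3.0)   # about 2,700 defects per million
at_6_sigma = fraction_outside(6.0)   # about 2 defects per billion
```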
Bibliography
Deming, W. Edwards. “Walter A. Shewhart, 1891–1967.” American Statistician, 21 (1967).
———. Out of the Crisis. Cambridge, MA: MIT Press, 2000.
Juran, Joseph M. Quality Control Handbook. New York: McGraw-Hill, 1999.
———. Management of Quality Control. New York: Joseph M. Juran, 1967.
Snee, Ronald D., and Roger W. Hoerl. Leading Six Sigma: A Step-by-Step Guide Based on Experience With GE and Other Six Sigma Companies. Upper Saddle River, NJ: FT Press, 2002.