Matrices (mathematics)
Matrices are rectangular arrays of numbers that serve as a fundamental concept in mathematics, particularly within the field of linear algebra. They are utilized to represent coefficients in systems of linear equations, enabling efficient solutions to these equations. The history of matrices dates back more than 2000 years to ancient China; the closely related concept of determinants was introduced independently in Europe and Japan in the seventeenth century. In the nineteenth century, mathematicians systematized matrix theory, which has since played a crucial role in various scientific advancements, including quantum mechanics.
Today, matrices are widely taught in high school mathematics and have diverse applications ranging from cryptography and Internet security to computer graphics and statistical analysis. Techniques such as Gaussian elimination and matrix operations allow for the manipulation and solution of linear systems. The term "matrix," derived from the Latin word for "womb," reflects its foundational nature in mathematics, where it is regarded as a single object that can be operated on collectively. Contemporary research continues to explore efficient algorithms for matrix computations, underlining their relevance in both theoretical and applied mathematics.
Summary: Matrices are useful for a variety of calculations and applications.
Matrices are used throughout modern mathematics and statistics and their applications in the natural and social sciences. Matrix theory and the closely related theory of vector spaces form what is now known as “linear algebra”: the study of systems of linear equations and their solutions in n-dimensional space. A matrix is a rectangular array of numbers representing the coefficients of the unknowns in a linear system. The first example of such a system and its solution using matrix operations dates from more than 2000 years ago in China. The closely related concept of “determinants” was introduced independently in Japan and Europe in the seventeenth century. The systematic development of basic matrix theory, in both its algebraic and geometric aspects, took place in the nineteenth and early twentieth centuries. This theory played a major role in the development of quantum mechanics, the branch of physics underlying many of the technological advances of the twentieth and twenty-first centuries. Matrices have been commonly explored in high school since linear algebra became a standard topic in the mathematics curriculum during the middle of the twentieth century. Contemporary applications of matrix theory include cryptography, Internet security, and Internet search engines, such as Google.
Origin of the Term
The word “matrix” comes from Latin, meaning “womb,” deriving from mater (mother). The mathematical use was introduced by James Joseph Sylvester as “an oblong arrangement of terms consisting, suppose, of m lines and n columns, a Matrix out of which we may form various systems of determinants.” At present, the word “matrix” refers to a rectangular array of numbers regarded and manipulated as a single object.
Linear Systems and Row Operations
The first calculation with such an array dates from the Han dynasty in ancient China, in the Nine Chapters on the Mathematical Art, a practical handbook on surveying, engineering, and finance. One problem posed in the handbook is this:
There are three types of corn, of which three bundles of the first, two of the second, and one of the third make 39 measures. Two of the first, three of the second, and one of the third make 34 measures. One of the first, two of the second, and three of the third make 26 measures. How many measures of corn are contained in one bundle of each type?
In modern notation, this becomes a system of linear equations, with unknowns x, y, and z representing the three types of corn:
3x + 2y + z = 39
2x + 3y + z = 34
x + 2y + 3z = 26.
The ancient Chinese author writes the coefficients in a rectangular array and solves the system by performing operations on this array. In modern notation, start with the 3-by-4 augmented matrix of coefficients, and then (1) multiply row 2 by 3 and subtract 2 times row 1, and multiply row 3 by 3 and subtract row 1; (2) multiply row 3 by 5 and subtract 4 times row 2:

3 2 1 39      3 2 1 39      3 2  1 39
2 3 1 34  →   0 5 1 24  →   0 5  1 24
1 2 3 26      0 4 8 39      0 0 36 99
The third row of the last array represents the equation 36z=99, giving z=11/4. The second row represents 5y+z=24, giving y=17/4. The first row represents 3x+2y+z=39, giving x=37/4.
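The row operations described above can be carried out mechanically. The following is a minimal Python sketch (using exact rational arithmetic from the standard library) that reproduces the reduction and back-substitution for this system:

```python
from fractions import Fraction

# Augmented matrix of the corn problem: [coefficients | constants]
M = [[Fraction(v) for v in row] for row in
     [[3, 2, 1, 39],
      [2, 3, 1, 34],
      [1, 2, 3, 26]]]

def combine(M, i, ci, j, cj):
    """Replace row i with ci*(row i) - cj*(row j)."""
    M[i] = [ci * a - cj * b for a, b in zip(M[i], M[j])]

combine(M, 1, 3, 0, 2)  # row 2 := 3*row 2 - 2*row 1  ->  [0, 5, 1, 24]
combine(M, 2, 3, 0, 1)  # row 3 := 3*row 3 -   row 1  ->  [0, 4, 8, 39]
combine(M, 2, 5, 1, 4)  # row 3 := 5*row 3 - 4*row 2  ->  [0, 0, 36, 99]

# Back-substitution, exactly as in the text
z = M[2][3] / M[2][2]                                # 36z = 99         -> z = 11/4
y = (M[1][3] - M[1][2] * z) / M[1][1]                # 5y + z = 24      -> y = 17/4
x = (M[0][3] - M[0][1] * y - M[0][2] * z) / M[0][0]  # 3x + 2y + z = 39 -> x = 37/4
print(x, y, z)  # 37/4 17/4 11/4
```

Using Fraction rather than floating-point numbers keeps the answers exact, matching the fractional measures in the ancient solution.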
Gaussian and Gauss–Jordan Elimination
This simplification of linear equations by using one variable to cancel another is called “Gaussian elimination.” Carl Friedrich Gauss used it systematically in the early nineteenth century in his study of the orbit of the asteroid Pallas. An even more reduced version of a system, called “reduced row echelon form” or “Gauss–Jordan form,” was first published in a handbook on geodesy written by Wilhelm Jordan. At that time, elimination methods were considered a tool for geodesy rather than a part of mathematics.
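A generic Gauss–Jordan reduction can be sketched compactly in Python. This is a modern, illustrative implementation (not any particular historical algorithm), again using exact fractions:

```python
from fractions import Fraction

def rref(M):
    """Reduce an augmented matrix to reduced row echelon form (Gauss-Jordan)."""
    M = [[Fraction(v) for v in row] for row in M]
    pivot_row = 0
    for col in range(len(M[0])):
        if pivot_row == len(M):
            break
        # Find a row at or below pivot_row with a nonzero entry in this column
        pr = next((r for r in range(pivot_row, len(M)) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        # Scale the pivot row so the pivot entry becomes 1
        p = M[pivot_row][col]
        M[pivot_row] = [a / p for a in M[pivot_row]]
        # Eliminate the pivot column from every other row (above and below)
        for r in range(len(M)):
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

R = rref([[3, 2, 1, 39],
          [2, 3, 1, 34],
          [1, 2, 3, 26]])
# The last column of R now holds the solution x = 37/4, y = 17/4, z = 11/4
```

Unlike plain Gaussian elimination, which only clears entries below each pivot and then back-substitutes, Gauss–Jordan clears entries above the pivots as well, so the solution can be read directly from the final column.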
Other Historical Developments
The closely related concept of determinants originated during the late seventeenth century simultaneously in work of Seki Kowa, in Japan, and Gottfried Leibniz, in Germany. From the modern point of view, the determinant is a function of a matrix, so it is remarkable that the study of determinants originated more than a century before the study of matrices. A systematic theory of matrices, determinants, and systems of linear equations was developed by European mathematicians during the nineteenth century: the most important contributors were Augustin Cauchy, Arthur Cayley, Ferdinand Eisenstein, Ferdinand Frobenius, Charles Hermite, Edmond Laguerre, and Karl Weierstrass. Thomas Hawkins, a historian of mathematics who has done much research on these developments, argues that the most important motivation for this development was the Cayley–Hermite problem of determining all linear substitutions of the variables of a quadratic form that leave the form invariant.
A famous memoir by Cayley introduced the single-letter notation for matrices together with the operations of matrix addition and multiplication and clarified the relation between matrices and systems of linear equations: “a set of quantities arranged in the form of a square, for example,

a  b  c
a′ b′ c′
a″ b″ c″

is said to be a matrix. The notion of such a matrix arises naturally from an abbreviated notation for a set of linear equations, viz. the equations
X = ax + by + cz
Y = a′x + b′y + c′z
Z = a″x + b″y + c″z.”
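Cayley's point, that an entire substitution of variables can be handled as a single object, is easy to illustrate in Python. The numerical entries below are arbitrary stand-ins for the coefficients a, b, c, a′, and so on:

```python
# Two 3-by-3 matrices standing for two substitutions of variables
A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
B = [[2, 0, 1],
     [0, 1, 0],
     [1, 0, 2]]

def matvec(A, v):
    """Apply the substitution: (X, Y, Z) = A acting on (x, y, z)."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def matmul(A, B):
    """Matrix product: the coefficients of one substitution followed by another."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

v = [1, 1, 1]
# Performing substitution B, then A, equals applying the single product matrix AB:
assert matvec(A, matvec(B, v)) == matvec(matmul(A, B), v)
```

The final assertion captures the insight behind Cayley's definition of matrix multiplication: composing two linear substitutions yields a third, whose coefficient array is the product of the two matrices.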
Matrix Theory
In the twentieth century, Olga Taussky-Todd became what she later referred to as “a torchbearer for matrix theory.” During World War II, she worked on 6-by-6 matrices related to the flutter analysis of aircraft. She used a theorem by Russian mathematician Semyon Aranovich Gershgorin to reduce the amount of calculation required. The theory of solving matrix systems continues in the early twenty-first century as numerical analysts search for efficient algorithms. In addition to Taussky-Todd’s own theoretical and applied work in the area, she encouraged others to join in its development. Eventually, partly because of her influence, matrix theory became a true branch of mathematics instead of just a tool for applications.
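The theorem Taussky-Todd used, the Gershgorin circle theorem, locates every eigenvalue of a square matrix inside at least one disc in the complex plane centered at a diagonal entry, with radius equal to the sum of the absolute values of the other entries in that row. A minimal Python sketch (the matrix shown is an arbitrary example, not Taussky-Todd's flutter data):

```python
def gershgorin_discs(A):
    """Return a (center, radius) pair for each row of A; every eigenvalue
    of A lies in at least one of these discs in the complex plane."""
    return [(row[i], sum(abs(a) for j, a in enumerate(row) if j != i))
            for i, row in enumerate(A)]

A = [[4, 1, 0],
     [1, 3, 1],
     [0, 1, 10]]
print(gershgorin_discs(A))  # [(4, 1), (3, 2), (10, 1)]
```

Here the disc around 10 is disjoint from the other two, so one eigenvalue is guaranteed to lie within distance 1 of 10. Estimates of this kind can stand in for explicit eigenvalue computation, which by hand is far more laborious.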
Contemporary Applications
The theory of matrices is an essential part of linear algebra, which is a highly developed branch of mathematics, with many applications to the natural and social sciences. For example, matrix mechanics, the first formulation of quantum mechanics, led to the study of infinitely large matrices. Matrices also represent digital images on a computer, and, in musical set theory, matrices are used to analyze or to create compositions. Matrices whose entries are not numbers, as well as the calculus of matrices, have found important uses in statistics and engineering. Typical applications discussed in modern linear algebra textbooks are network flow, electrical resistance, chemical reactions, economic models, dynamical systems, vector geometry, computer graphics, least squares approximation, correlation and variance, optimization, and cryptography.
Bibliography
Austin, David. “How Google Finds Your Needle in the Web’s Haystack.” http://www.ams.org/samplings/feature-column/fcarc-pagerank.
Bernstein, Dennis S. Matrix Mathematics: Theory, Facts, and Formulas. 2nd ed. Princeton, NJ: Princeton University Press, 2009.
Grattan-Guinness, Ivor, and Walter Ledermann. “Matrix Theory.” In Companion Encyclopedia of the History and Philosophy of the Mathematical Sciences. Edited by I. Grattan-Guinness. Baltimore, MD: Johns Hopkins University Press, 2003.
Hawkins, Thomas. “Another Look at Cayley and the Theory of Matrices.” Archives internationales d’histoire des sciences 26 (1977).
Katz, Victor. “Historical Ideas in Teaching Linear Algebra.” In Learn from the Masters! Edited by Frank Swetz, John Fauvel, Otto Bekken, Bengt Johansson, and Victor Katz. Washington, DC: Mathematical Association of America, 1995.
MacTutor History of Mathematics Archive. “Matrices and Determinants.” http://www-history.mcs.st-and.ac.uk/HistTopics/Matrices_and_determinants.html.
MacTutor History of Mathematics Archive. “Nine Chapters on the Mathematical Art.” http://www-history.mcs.st-and.ac.uk/HistTopics/Nine_chapters.html.