Vectors (mathematics and physics)
Vectors in mathematics and physics are quantities that possess both magnitude and direction, distinguishing them from scalars, which have only magnitude. For example, stating that the time is 6 a.m. is a complete description, but a speed of five meters per second is meaningful only when the direction of motion is also given. Vectors are foundational in various applications, from navigation in aviation to modeling dynamics in sports and physics. Historically, the concept of vectors evolved from early recognition by figures like Aristotle to rigorous formalizations in the nineteenth century, leading to the development of vector calculus.
Mathematically, a vector is represented as a directed line segment or an array, and operations such as addition and scalar multiplication follow specific rules, including the Triangle Law and properties like commutativity and associativity. Vectors are crucial in fields such as fluid dynamics, electromagnetism, and mechanics, where they describe forces, velocity, and momentum. Additionally, the study of vector fields, which represent the influence of forces across a region, is vital in understanding electric and magnetic phenomena. Ultimately, the principles of vectors and their calculus continue to be integral to both theoretical and applied sciences.
Summary: Vectors express magnitude and direction, and have applications in physics and many other areas.
Some quantities, called “scalars,” have only a magnitude; time and work are examples. Saying that the time is 6 a.m. specifies it completely. When discussing velocity or force, however, magnitude alone is not enough. Stating that a particle moves at five meters per second is insufficient because the direction of motion is unknown. Quantities that require both a magnitude and a direction for their complete specification are called “vectors.” Pilots use vectors to compensate for wind when navigating airplanes, sports analysts use vectors to model the dynamics of play, and physicists use vectors to model the physical world.
[Figure: A vector a = ax + ay + az in a Cartesian coordinate system.]

[Figure: Torque represented with vectors.]
History and Development of Vectors
The term “vector” originates from vectus, a Latin word meaning “to carry.” The concept of a vector as a quantity with magnitude and direction, however, was motivated by astronomy and physical applications. Aristotle recognized force as a vector. Some historians question whether the parallelogram law for the vector addition of forces was also known to Aristotle, although they agree that Galileo stated it explicitly and that it appears in Isaac Newton’s 1687 work Principia Mathematica. Aside from the physical applications, vectors were useful in planar and spherical trigonometry and geometry. Vector properties and sums continue to be taught in high schools in the twenty-first century.
The rigorous development of vectors into the field of vector calculus in the nineteenth century resulted in a debate over methods and approaches. The algebra of vectors was created by Hermann Grassmann and William Rowan Hamilton. Grassmann extended the concept of a vector to an arbitrary number of dimensions in his book The Calculus of Extension, while Hamilton applied vector methods to problems in mechanics and geometry using the concept of a “quaternion,” which he spent the rest of his life advocating. James Clerk Maxwell, in his Treatise on Electricity and Magnetism, emphasized the importance of quaternions as a mathematical way of thinking, while at the same time critiquing them and discouraging scientists from using them. Extending Grassmann’s ideas, Josiah Willard Gibbs laid the foundations of vector analysis and created a system that was more easily applied to physics than Hamilton’s quaternions. Oliver Heaviside independently created a vector analysis and advocated for vector methods and vector calculus. Mathematicians such as Peter Tait, who preferred quaternions, rejected the methods of Gibbs and Heaviside, but those methods were eventually accepted and are now taught as part of the field of linear algebra. The quaternionic method of Hamilton also remains extremely useful in the twenty-first century. Vector calculus is fundamental to the understanding of fluid dynamics, solid mechanics, electromagnetism, and many other applications.
During the nineteenth century, mathematicians and physicists also developed the three fundamental theorems of vector calculus, referred to in the twenty-first century as the “divergence theorem,” “Green’s theorem,” and “Stokes’s theorem.” Mathematicians with diverse motivations all contributed to the development of the divergence theorem: Mikhail Ostrogradsky studied the theory of heat, Siméon Poisson studied elastic bodies, Frédéric Sarrus studied floating bodies, George Green studied electricity and magnetism, and Carl Friedrich Gauss studied magnetic attraction. The theorem is sometimes referred to as “Gauss’s theorem.” George Green, Augustin Cauchy, and Bernhard Riemann all contributed to Green’s theorem, and Peter Tait and James Maxwell created vector versions of Stokes’s theorem, which was originally explored by George Stokes, Lord Kelvin, and Hermann Hankel. Undergraduate college students often encounter these theorems in a multivariable calculus class.
The concept of a space consisting of a collection of vectors, called a “vector space,” became important in the twentieth century. The notion was axiomatized earlier by Jean-Gaston Darboux and defined by Giuseppe Peano, but their work was not appreciated at the time. However, the concept was rediscovered and became important in functional analysis because of the work by Stefan Banach, Hans Hahn, and Norbert Wiener, as well as in ring theory because of the work of Emmy Noether. Vector spaces and their algebraic properties are regularly taught as a part of undergraduate linear algebra.
Mathematics
A vector is defined as a quantity with magnitude and direction. It is represented as a directed line segment whose length is proportional to the magnitude and whose direction is that of the vector. As an array, a vector is often written as a row or column matrix. Vectors are usually denoted by boldface capital letters, such as A, or by a letter with an arrow overhead: A⃗.
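As a concrete illustration of the array representation, here is a minimal Python sketch (the vector chosen is arbitrary) that stores a three-dimensional vector as an array of components and computes its magnitude:

```python
import math

# A vector stored as an array of components along the x-, y-, and z-axes.
a = [3.0, 4.0, 12.0]

# The magnitude (length) is the square root of the sum of the squared components.
magnitude = math.sqrt(sum(component ** 2 for component in a))
print(magnitude)  # 13.0, since 3^2 + 4^2 + 12^2 = 169 and sqrt(169) = 13
```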
The Triangle Law of vector addition states that “if two vectors can be represented as the two sides of a triangle taken in order, then the resultant is represented as the closing side of the triangle taken in the opposite order” (see Figure 1).
Any vector can be split into components, meaning it is divided into parts directed along the coordinate axes; when added, these components return the original vector. This process is called “resolution into components” (see Figure 2). The resolution is not unique in general, because it depends on the choice of coordinate axes, but for a given vector and specified coordinate axes it is unique. When two vectors are added or subtracted, their components along each axis combine like ordinary numbers (as in 2 + 2 = 4 or 7 − 2 = 5), whereas the vectors themselves combine according to the Parallelogram Law of Vector Addition. Vector addition is commutative and associative.
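The component-wise behavior described above can be checked numerically. The sketch below, using arbitrary sample vectors, adds vectors component by component and confirms that the result does not depend on the order or grouping of the terms:

```python
def add(u, v):
    """Add two vectors component by component."""
    return [ui + vi for ui, vi in zip(u, v)]

a = [2.0, 1.0]
b = [1.0, 3.0]
c = [-4.0, 2.0]

# Components along each axis simply add up: (2 + 1, 1 + 3) = (3, 4).
print(add(a, b))                               # [3.0, 4.0]

# Commutativity: a + b equals b + a.
print(add(a, b) == add(b, a))                  # True

# Associativity: (a + b) + c equals a + (b + c).
print(add(add(a, b), c) == add(a, add(b, c)))  # True
```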
Multiplication for vectors can be of a few types:
- 1. For scalar multiplication (multiplication by a quantity that is not a vector), each component is multiplied by that scalar. Multiplication of a vector by a scalar is commutative, associative, and distributive.
- 2. The multiplication of two vectors can yield either a scalar (the dot product) or a vector (the cross product). For the cross product, the resultant is perpendicular to the plane containing the two original vectors. The dot product is both commutative and distributive, but the cross product is neither commutative nor associative; reversing the order of the factors reverses the direction of the resulting vector.
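A brief sketch of both products for three-dimensional vectors (the sample vectors are arbitrary choices): the dot product yields a single number and is unchanged when the factors are swapped, while the cross product yields a vector perpendicular to both inputs and reverses direction when the factors are swapped.

```python
def dot(u, v):
    """Scalar (dot) product: a single number."""
    return sum(ui * vi for ui, vi in zip(u, v))

def cross(u, v):
    """Vector (cross) product of two 3-D vectors."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

a = [1.0, 0.0, 0.0]
b = [0.0, 1.0, 0.0]

print(dot(a, b))               # 0.0 -- perpendicular vectors have zero dot product
print(dot(a, b) == dot(b, a))  # True -- the dot product is commutative
print(cross(a, b))             # [0.0, 0.0, 1.0] -- perpendicular to both a and b
print(cross(b, a))             # [0.0, 0.0, -1.0] -- swapping the factors flips the sign
```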
Applications
Vectors find applications in nearly all fields of the theoretical and applied sciences:
- Obtaining components: Occasionally, one needs a part (or component) of a vector for a given purpose. For example, suppose a rower intends to reach a point directly across a river with a strong current. The rower would want to know whether any part of that current helps move the boat in the desired direction. To find the component of the current’s vector along a specified direction, take the dot product of that vector with a unit vector (a vector of unit magnitude) pointing in that direction, as in the sketch following this list. This method is of particular importance in the study of particle dynamics and force equilibria.
- Evaluating volume, surface, and line integrals: In many problems of physics, it is necessary to pass from an integral over a closed surface (one that encloses a volume) to an integral over the enclosed volume, or from an integral around a closed loop to an integral over a surface bounded by that loop. These shifts are accomplished by two fundamental theorems of vector calculus, Gauss’s divergence theorem and Stokes’s theorem, respectively (stated after this list).
- Particle mechanics: Vectors are used extensively in the study of particle mechanics. Because velocity, acceleration, force, momentum, and torque are all vectors, a proper study of mechanics invariably involves extensive applications of vectors.
- Vector fields: A field is a region over which the effect or influence of a force or system is felt. In physics, it is very common to study electric and magnetic fields, which are described using vectors and vectorial techniques.
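Returning to the rower in the first item above, here is a minimal sketch of the component calculation, with made-up numbers for the current and the desired direction: normalize the desired direction to a unit vector, then take the dot product.

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def unit(v):
    """Return the unit vector (magnitude 1) pointing in the direction of v."""
    length = math.sqrt(dot(v, v))
    return [vi / length for vi in v]

current = [2.0, 1.0]   # hypothetical river current (meters per second)
desired = [3.0, 4.0]   # hypothetical direction the rower wants to travel

# Component of the current along the desired direction = current . unit(desired)
component = dot(current, unit(desired))
print(component)       # 2.0, i.e., 2 m/s of the current helps the rower
```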
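For reference, the two integral theorems named in the second item above can be stated compactly in standard notation, where F is a suitably smooth vector field, V is a solid region with boundary surface ∂V, and S is a surface with boundary curve ∂S:

$$\iint_{\partial V} \mathbf{F} \cdot d\mathbf{S} = \iiint_{V} (\nabla \cdot \mathbf{F}) \, dV \qquad \text{(divergence theorem)}$$

$$\oint_{\partial S} \mathbf{F} \cdot d\mathbf{r} = \iint_{S} (\nabla \times \mathbf{F}) \cdot d\mathbf{S} \qquad \text{(Stokes's theorem)}$$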