Computation

At the most fundamental level, computation is the act of using a mathematical operation—such as addition, subtraction, multiplication, or division—to solve a problem or interpret a set of data. Different methods of computation are classified as either natural or mechanical. Throughout history, humans have developed a seemingly endless array of mechanical devices, from simple abacuses to powerful supercomputers, designed to carry out computations of varying complexity.

Functions of Computation

Computation is a multifaceted concept with many real-world applications. Most often, computation is thought of in terms of its role in mathematics. Mathematical computation refers to the act of finding numerical values through mathematical operations. This kind of computation encompasses tasks ranging from solving simple addition and subtraction problems to working through complicated algebra and calculus equations.
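As a rough illustration of this range, the short Python sketch below (the language and the numbers are chosen purely for demonstration) performs a few such computations, from simple arithmetic to a finite-difference approximation of a derivative, the kind of numerical step that underlies much of computational calculus.

    # Simple arithmetic computations.
    total = 17 + 25              # addition
    difference = 90 - 48         # subtraction

    # Solving a simple algebraic equation: 3x + 5 = 20.
    x = (20 - 5) / 3             # x = 5.0

    # Approximating a derivative with a finite difference:
    # for f(x) = x**2, f'(3) should be close to 6.
    def f(value):
        return value ** 2

    h = 1e-6
    derivative_at_3 = (f(3 + h) - f(3)) / h

    print(total, difference, x, round(derivative_at_3, 4))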

Computation is also tied to data processing and simulation. Data processing, as it relates to computation, is the organization and analysis of large volumes of information, usually in connection with scientific research. Used in this context, computation gives scientists a means of understanding the data they gather and allows them to synthesize results from that data. Simulation refers to the generation of data meant to model the behavior of a larger process. The computations involved in simulation help illustrate how complex systems, such as economies, work. However it is used, computation is an indispensable tool that allows people to understand the world in which they live.
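A minimal sketch of both ideas appears below, again in Python and with invented numbers: the first half summarizes a small set of measurements (data processing), and the second runs a toy simulation of compound growth, a stand-in for the far more elaborate models used to study economic systems.

    import statistics

    # Data processing: summarizing a small set of measurements.
    measurements = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3]
    print("mean:", round(statistics.mean(measurements), 3))
    print("spread:", round(statistics.stdev(measurements), 3))

    # Simulation: a toy model of a quantity growing 5 percent
    # per step for ten steps, e.g. a savings balance.
    balance = 1000.0
    for step in range(1, 11):
        balance *= 1.05
        print(f"step {step}: {balance:.2f}")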

Types of Computation

Many methods of computation exist. In general, these methods are classified according to the medium used during computation. The two main classifications of computation are natural computation and mechanical computation.

Natural computation refers to the computation that occurs within an organic system, such as the human brain. In its own way, the human brain is a computational machine powered by an enormous network of neurons along with specialized structures known as receptors and effectors. Receptors receive data from the external world, and effectors control the body's responses. As a result of this arrangement, humans are naturally equipped to carry out a diverse array of mathematical and other computations, either mentally or by working out the solution to a problem with pen and paper. Although its capacity varies from individual to individual, natural computation is essentially the basis of all other forms of computation and is, therefore, of vital importance.

Mechanical computation is any computation that is completed by or with the assistance of some sort of mechanical device. The kinds of devices used for this purpose vary widely. The most basic computational devices include abacuses and slide rules; more advanced devices include common pocket calculators and personal computers. All of these devices are designed to make it easier for humans to perform computations, particularly those that would be difficult, if not impossible, to complete mentally. As it relates to mathematical computation, mechanical computation is typically characterized as either digital or analog. In digital computation, numbers are represented as discrete symbolic entities, while in analog computation, numbers are represented by continuously variable physical quantities within a broader system. To use the simplest examples, an abacus, which tracks counts with individual beads, performs digital computation, while a slide rule, which multiplies by adding physical lengths, performs analog computation.
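The distinction can be sketched in a few lines of Python (a digital medium, so the analog half is only an imitation of the physical process): a slide rule multiplies two numbers by physically adding lengths proportional to their logarithms, while a digital device manipulates the numbers as exact symbols.

    import math

    # Digital computation: numbers as discrete symbols,
    # manipulated exactly, as an abacus tracks counts bead by bead.
    digital_product = 8 * 16          # exactly 128

    # Analog computation: a slide rule adds two physical lengths
    # proportional to log(a) and log(b); their sum corresponds
    # to log(a * b). Here that physical addition is imitated.
    length_a = math.log10(8)
    length_b = math.log10(16)
    analog_product = 10 ** (length_a + length_b)

    print(digital_product)            # 128 (exact)
    print(analog_product)             # ~128 (limited by precision)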

Computers

Computers are the most advanced form of computational device developed by human innovators. The history of the computer as we know it today dates back to the early nineteenth century, when British inventor Charles Babbage twice tried to create what was then called a calculating machine, first with his Difference Engine and later with his Analytical Engine. Although neither machine was ever completed, the designs he established were the earliest forerunners of the modern computer.

The first calculating machine to actually be completed was designed and built by American inventor Herman Hollerith for the 1890 United States census. His tabulating machine, which used punched cards to record information, was intended to speed up the time-consuming job of processing census data. The machine was an immediate success, and Hollerith eventually formed his own company, which later became known as the International Business Machines Corporation (IBM).

In 1944, IBM turned its attention to digital computers, installing the first such device, known as the Harvard Mark I, at Harvard University. Unlike its predecessors, the Mark I could complete computations entirely on its own, without the need for human manipulation. Later that decade, the Electronic Numerical Integrator and Computer (ENIAC) made its debut. The ENIAC was a general-purpose electronic computer capable of completing up to five thousand addition operations in a single second. Shortly after the ENIAC appeared, it was followed by the Electronic Discrete Variable Automatic Computer (EDVAC), one of the first stored-program electronic computers.

Over time, hardware advancements, particularly the transistor and the microchip, led to smaller computers that could be used for a wide variety of tasks. In the late twentieth century, this trend gave rise to the modern personal computer and the countless other electronic computational devices available today.
