Computer programs and subroutines
Computer programs are sequences of instructions, stored in the computer's memory and executed on command, that enable a computer to perform a wide variety of tasks. A key component of these programs is the subroutine, a defined section that carries out a specific function, thereby enhancing the program's modularity and making it easier to manage and debug. Programs can be resident within the computer's hardware or loaded from external storage, and they facilitate numerous activities such as calculations, data manipulation, and user interaction.
For example, a common task within a program might be converting Celsius to Fahrenheit, using variables to store temperature values so that inputs can be changed easily. Modern computer programs can also incorporate branching, enabling them to make decisions based on user input or predefined conditions. This capability is particularly useful in applications where interactive data entry is required, such as airline reservation systems.
The historical evolution of programming showcases its roots in various fields, transforming from early mechanical devices to sophisticated modern software applications. As technology advances, the integration of simulations, hypermedia, and artificial intelligence is expected to further enhance how users interact with programs, making them increasingly accessible and powerful in various domains, including science, business, and education. Understanding the foundations of computer programming and subroutines is essential for grasping the broader implications of technology in everyday life.
Subject Terms
Computer programs and subroutines
Type of physical science: Computation
Field of study: Computers
The list of instructions that controls the operation of a computer is called a program. The program is generally composed of a number of useful sections called subroutines or procedures, each of which performs a subtask for the overall program.


Overview
A computer program is a sequence of instructions for a computer to perform. The program is stored in a portion of the computer's memory and is not executed until a command is given to do so. A subroutine or procedure is a portion of a program that performs a single task.
The hardware or computer itself may contain some programs that are built in, or "resident" in the microchips. Generally, these programs control the way in which the computer accesses and runs other programs from a disk or other storage media. Some computers, however, are designed to do only one task, and the program that controls that task is built into the computer hardware. A common example is a machine that is referred to as a "word processor," which cannot do tasks other than word processing.
In computers that are designed to be able to do many things, however, only certain programs are built into the hardware. Most of the computer's tasks will be controlled by programs stored on a temporary medium such as a floppy disk and loaded into the computer so that the program resides in memory only while it is being used. The term "software" is used to refer to these programs.
Some common activities that a computer program may instruct the computer to perform include: print on the screen; draw on the screen; make a sound; send or receive a signal through a cable, as when connecting to a modem, printer, or disk drive; ask a question (print on screen) and accept a user's answer from the keyboard or other input device; evaluate conditions to decide whether to do A or B; and make calculations on numbers and manipulate symbols.
The commands of a program are made up of a verb and an optional tag. For example, in one programming language, BASIC, the command to print HELLO on the screen is PRINT "HELLO" (verb, tag), and the command to end the program is END (verb). The order of the commands depends upon the sequence of activities the computer is to perform and upon the programming language. Illustrations that follow use the programming language BASIC, which is a built-in language on most microcomputers.
In BASIC, the commands are numbered conventionally by tens, and the computer performs them in order unless a command is encountered that tells it to go to a new number out of sequence and continue from there. To use a simple example: Line 20 tells the computer to go to line 40 and continue on from there, skipping over line 30. This program will therefore produce output that looks like this: This is not a program that makes particular sense; it simply illustrates that BASIC may send control to a line that is numbered out of sequence.
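The original listing is not reproduced in this text; a minimal program consistent with the description (the printed words are illustrative assumptions) would be:

```basic
10 PRINT "APPLE"
20 GOTO 40
30 PRINT "BANANA"
40 PRINT "CHERRY"
50 END
```

Running it would print APPLE and then CHERRY, skipping BANANA, because line 20 sends control directly to line 40.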
Computer programs and their subroutines are tools that extend the abilities of the mind.
They can accomplish calculations and other symbol manipulations that are too costly, difficult, or repetitive for humans to perform. To understand how a computer program can do this, it is important to understand how variables work.
A variable is a location in the computer's memory that has a name and has something in it, for example, a word or number. The variable's name may then be used by the programmer to refer to its contents, so that the program will work regardless of the data given to it. For example, suppose one wants to write a program that will perform a calculation and print the answer--in this case, a conversion of Celsius to Fahrenheit. In the BASIC program that follows, C is the name of the variable that contains the Celsius temperature and F is the name of the variable that will contain the corresponding Fahrenheit value. The symbol * is used to represent multiplication.
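The program itself is not reproduced in this text; reconstructed from the line-by-line description that follows, it would read:

```basic
10 LET C=45
20 LET F=(9/5)*C+32
30 PRINT C;" CELSIUS EQUALS ";F;" FAHRENHEIT"
40 END
```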
Here is what each line does: LET C=45 creates a variable named C and puts 45 in it.
LET F=(9/5)*C+32 is the conversion formula that creates a second variable and puts in it a value resulting from the calculation 9/5 times the value of what is in C, which is 45, plus 32.
Line 30 prints out on the screen a series of things on one line: the value of C, the words in quotation marks, the value of F, and the final word in quotes: 45 CELSIUS EQUALS 113 FAHRENHEIT.
Once one has seen a program use variables to make a calculation that one would not easily make oneself, it is possible to begin to appreciate the usefulness of computer programs.
The same program, with one minor modification, can be used to calculate a new Fahrenheit value. Simply change line 10 to a new Celsius value and run the program again. Similarly, when a bridge builder needs to figure the stresses on a bridge that is 2.7 meters longer than the last bridge that was built, the same program for calculating stresses can be used again with new particular data.
Variables are also used when a program is interactive--that is, when it is designed to deal with data the user enters during the operation of the program. One such example is the program that is used to make airline reservations. The programmer cannot know ahead of time what data will be entered and has to plan for variables to be created later when the travel agent makes an inquiry. Following is an oversimplified example using the BASIC command INPUT to create a variable at the time when the user enters the data.
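The original segment is not shown in this text; a sketch consistent with the description that follows (the exact prompts and the numeric variable N are illustrative assumptions) might be:

```basic
10 PRINT "WHAT IS THE DESTINATION"
20 INPUT DES$
30 PRINT "HOW MANY PASSENGERS"
40 INPUT N
50 END
```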
This program segment prints the first question on the screen, waits for the travel agent to type in the answer, prints the second question, and waits for that answer. In each case, the variable has both a name and a value only after the user types an answer. (The $ in the name DES$ tells the computer that the variable will hold text--a string--rather than a number; numeric variables do not use this symbol.)
The ability of a program to make decisions based on the values of variables is called branching. The Celsius conversion program would work this way as a branching, interactive program:
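The listing itself is not reproduced in this text; a reconstruction consistent with the behavior described below (the line numbers are chosen to match the text's reference to line 60) is:

```basic
10 PRINT "ENTER A CELSIUS TEMPERATURE"
20 INPUT C
30 LET F=(9/5)*C+32
40 IF F<=85 THEN 60
50 PRINT "IT IS VERY HOT": GOTO 80
60 IF F>=20 THEN 80
70 PRINT "IT IS VERY COLD"
80 END
```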
This program will say it is very hot if the user gives a Celsius temperature that converts to more than 85 degrees Fahrenheit; then it will branch to the end and stop. If the Fahrenheit temperature is not more than 85 degrees, however, control passes to line 60, which checks to see if it is under 20 degrees. If so, it comments on the cold; if not, it ends.
Subroutines or procedures are tools that the programmer constructs to make a customized, modular program, thus making it more manageable. It is easier to plan and "debug" (fix the mistakes in) a program when it is modularized. The lead programmer may assign a number of people to work on the project, giving each the task of working out the details of how to do a particular subsection.
A subroutine is used to perform repetitive tasks, such as telling the user to press the RETURN key to continue reading, or to carry out a single task, such as putting names in alphabetical order. The subroutine is a part of the overall program, but is a clearly defined part with a clear beginning and end. The main part of the program is said to "call" the subroutine, or start that module running, when it needs that particular task to be carried out. A program designed to keep track of people, social security numbers, hours worked, and wages paid would need at least the following procedures: alphabetizing, calculating wages, printing out checks, answering queries about work patterns over a period of time, and making changes in any of these lists. Subroutines would be developed by the programmer and clearly defined as sequences of commands. Then, when using this software, a person could choose from a menu which subroutine the computer should carry out: alphabetize names; print the names, hours, wages; print checks; show employee X's hours worked for the last three months; and so on.
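In BASIC, a subroutine is called with GOSUB and ends with RETURN, which sends control back to the line after the call. A skeletal sketch of such a menu (the menu entries and placeholder subroutine bodies are illustrative; a real program would contain the full sorting and check-printing logic):

```basic
10 PRINT "1 ALPHABETIZE NAMES"
20 PRINT "2 PRINT CHECKS"
30 PRINT "WHICH TASK"
40 INPUT T
50 IF T=1 THEN GOSUB 100
60 IF T=2 THEN GOSUB 200
70 END
100 REM SUBROUTINE: ALPHABETIZE NAMES (DETAILS OMITTED)
110 PRINT "NAMES ALPHABETIZED"
120 RETURN
200 REM SUBROUTINE: PRINT CHECKS (DETAILS OMITTED)
210 PRINT "CHECKS PRINTED"
220 RETURN
```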
Applications
Computers are useful for speedy and accurate symbol manipulation, transforming data from one symbol system to another and looking up information in large sets of data. Programs for controlling these processes affect humans throughout their lives, from the census to the library to the automobile. Computer programs, both those that are specifically designed to solve a particular problem and published software that performs fairly standard and common tasks, such as statistical analysis or graphical display of data, are used in all the sciences. Other uses of computer programs in science include desktop publishing of reports, database searching, computer-aided design, simulation of processes to be studied, hypermedia research, scientific visualization, robotics and other artificial intelligence activities, and supercomputing. The focus here will be on three somewhat interrelated activities: simulation and visualization, supercomputing, and hypermedia.
Simulation is the modeling of a real event or process, in this case by a computer program; it is used as a means of studying an event or process that is too complex, too dangerous, too large, or too small to study readily otherwise. Population control simulations involve using mathematical formulas in a computer program (recall the formula to convert Celsius to Fahrenheit) to show graphically what will happen as different variables are given particular values. Common simulations of this type involve the study of the variables that influence the size of herds of animals: food supply, weather, and reproduction rate, among other things. After the user enters information about any variable, the computer displays a graph that shows the predicted change in the size of the herd over several years. The interactions of these variables must be understood well enough to be able to specify the formulas in a program.
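As a drastically simplified sketch of such a simulation (a single invented growth-rate variable R stands in for the many real factors), a program might project herd size year by year:

```basic
10 PRINT "ENTER STARTING HERD SIZE"
20 INPUT H
30 PRINT "ENTER ANNUAL GROWTH RATE, E.G. .1"
40 INPUT R
50 FOR Y=1 TO 10
60 LET H=H*(1+R)
70 PRINT "YEAR";Y;": HERD SIZE";INT(H)
80 NEXT Y
90 END
```

A real simulation would replace the single rate R with formulas for food supply, weather, reproduction rate, and so on, and would display the results graphically rather than as a list.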
Simulations are also an aid to the study of complex processes that are not easily managed by the human mind when they are represented as raw data or lists of numbers, but are much more easily grasped when translated into graphics. Much of this type of work is handled by supercomputers, which can deal with billions of numbers. An example is the use of the supercomputer to illustrate the interaction of a gravitational wave with a black hole, work done by Larry Smarr and others at the University of Illinois. Teams of graphic artists, computer programmers, and scientists work together to find the best way to translate numerical data into color, size, shape, and animation. Another example of computers as an aid to visualization is medical simulation, for example, the simulation of a working heart, or the simulation of different disease processes. Using programs such as these, a researcher can easily recognize the variety of appearances of the heart or the differences in symptoms among individuals with the same disease. A final example is plate tectonics, in which researchers use visualization to study factors that might predict earthquakes. Such simulations are based on data provided by past events, in the hope that models eventually will be accurate enough to enable the prediction of future events.
Hypermedia systems are programs that allow the scientist to access information from various media, such as text, sound, video images, and animation. A controlling program allows the user to choose which topic or image to see next (recall the example in which a menu allowed the user to choose what to do next). A scientist might, for example, request a video image of a plant that has been damaged by a particular virus, ask to see the virus that caused the damage, ask what is known about the virus, request recently published documents on treatments, and so forth. Simulations of the growth of the virus and the path of the disease may be requested. The researcher can travel through such a system in any direction, depending on prior knowledge, goals, interest, or hunches. It is a way of having at one's fingertips all that is known about a subject.
Well-planned modular procedures work together to produce programs that enable a user to make choices about entering data, activities for the computer to perform, visuals to see on the screen, and paths to take through the information. By understanding how programs and subroutines work and the concepts of branching and interactivity, one can understand the basic building blocks of these complex systems.
Context
The history of computer programming includes such diverse fields as textiles, mathematics, spying, and business. As early as 1801, the Jacquard loom used computer programs (in the form of a series of punch cards through which the warp was drawn) to control patterns.
Several mathematicians attempted to develop machines that would reduce the tedium of calculation. Blaise Pascal built a calculating machine in 1641 to aid his father, who was a tax collector. In 1833, Charles Babbage, often said to be the inventor of the computer, designed the first general-purpose computer, the analytical engine. This machine could do any of a host of tasks that were indicated in a set of instructions--a computer program. Ada Lovelace worked with Babbage; she was the first computer programmer. Babbage himself never built the machine, but his son did many years later.
Herman Hollerith devised a punch-card system that was used for the 1890 census; it was electrical rather than mechanical. Alan Mathison Turing, an English mathematician, worked to develop machines that could crack the codes used by the Germans in World War II, while Germans worked on machines that were intended to produce uncrackable codes. Turing's work is often credited with helping the Allies win the war.
In the evolution of the modern electronic computer, hardware and programming passed through a number of stages: electromagnetic relays, vacuum tubes, and silicon chips, and both decimal and binary representations of numbers. The first modern computer was the huge Mark I, built by IBM for Harvard University, which began to operate during World War II. After the war, higher-level programming languages were devised that used instructions more like ordinary words, with the computer itself making the translation to binary code. With the birth of the microcomputer during the mid-1970's, programming began to be used more widely and for different purposes as it branched out into business, education, and home uses.
Modern programming offers considerably more sophistication than the introductory examples given above. For example, object-oriented programming languages allow the programmer to connect objects, such as a label drawn on the screen, to other objects, such as a new screen of information, by simply making choices in menus. Users do not need to know how to program the computer at all, but simply how to read, follow directions, and solve problems.
In the future, the integration of simulation, hypermedia, and artificial intelligence for problem solving will affect all areas of life. Readily available tools will enable the casual user to access information about almost anything simply by asking in ordinary language. People will be able to see video clips from the Library of Congress on their own home computers. Collaborative work will be done at long distances, and the nature of the workplace may change as well. All of these possibilities are controlled by computer programs, which serve as the interface between the user and the computer.
Principal terms
DATABASE: a piece of software that allows the user to define, create, store, and retrieve records on individual cases; a database makes easy the retrieval of data, frequency counts, tables, and the like
HARDWARE: the computer and its peripheral devices, such as monitor, printer, disk drive
HYPERMEDIA: the assemblage of many media (sound, video, text) into one electronic document that the user may traverse in any of a number of directions, depending on the user's interests
MEMORY: the space available for a program and any data that the user enters; microcomputer memory is measured in kilobytes, each of which is 1,024 bytes; a byte holds roughly a single letter or digit
PROCEDURE: a series of commands that carry out various operations that are the components of a single subtask of the computer program
SIMULATION: the modeling of a real process or event by a computer
SOFTWARE: the computer program
VARIABLE: a location in the memory of the computer; it has both a name and a value associated with it; the value is said to be the contents of that variable
Bibliography
Cox, Donna J. "The Art of Scientific Visualization." ACADEMIC COMPUTING 3 (March, 1990): 20-23. In this article, Cox gives a number of examples of displays that help scientists visualize, particularly the black hole example. This particular issue of this magazine is interesting as a whole, since it contains other articles on scientific visualization. Written for a nontechnical audience that is somewhat familiar with computers.
Evans, Christopher. THE MICRO MILLENNIUM. New York: Simon & Schuster, 1979. This Washington Square Press paperback is a delightful history of computing.
Fraase, Michael. MACINTOSH HYPERMEDIA. 2 vols. Vol. 1, REFERENCE GUIDE, and Vol. 2, USES AND IMPLEMENTATIONS. Glenview, Ill.: Scott, Foresman, 1989-1990. This two-volume set covers a number of hypermedia authoring systems (programs that themselves may be used to help a programmer easily develop a hypermedia document) and a number of applications that have been developed or were under development at the time of the writing.
Horn, Carin E., and James L. Poirot. COMPUTER LITERACY: PROBLEM-SOLVING WITH COMPUTERS. 2d ed. Austin, Tex.: Sterling Swift, 1985. An excellent resource for the general reader, this book includes history, applications in government, occupations, societal issues, and a brief introduction to programming in two languages.
Luehrmann, Arthur, and Herbert Peckham. COMPUTER LITERACY: A HANDS-ON APPROACH. New York: McGraw-Hill, 1983. This popular high school introductory textbook is excellent in its treatment of the introduction of BASIC programming in a context in which students examine subroutines, structured programming, and the societal context of programming.
McCorduck, Pamela. MACHINES WHO THINK. San Francisco: W. H. Freeman, 1979. A classic, this paperback is a widely read introduction to the history, purposes, and societal context of the field of artificial intelligence. Geared for the nontechnical, literary reader.
Roszak, Theodore. THE CULT OF INFORMATION: THE FOLKLORE OF COMPUTERS AND THE TRUE ART OF THINKING. New York: Pantheon, 1986. A social history of modern computing and a critique of the enterprise as it is conducted by those who have questionable moral purposes. Very readable.
Schank, Roger. THE COGNITIVE COMPUTER: ON LANGUAGE, LEARNING, AND ARTIFICIAL INTELLIGENCE. Reading, Mass.: Addison-Wesley, 1984. Gives the reader a sense of the kinds of issues and problems involved in getting a computer to understand a natural language. An excellent source for the interested reader who wants to understand the issues in the field of artificial intelligence and wants to assess the likelihood of its success.
Illustrations (not reproduced here): BASIC commands executed in line-number order; output of a simple BASIC program; a BASIC program to convert Celsius to Fahrenheit; the BASIC command INPUT creating a variable; a BASIC branching, interactive program; programming languages for artificial intelligence.