Structured Programming
Structured programming is a programming paradigm that emphasizes a disciplined approach to software development, focusing on managing complexity and improving code quality. It involves two key aspects: the structured management of software throughout its life cycle and the systematic writing of computer programs. A well-defined process, guided by the Information System Development Life Cycle (ISDLC), ensures that projects progress through distinct phases—from definition and construction to implementation and maintenance.
One of the prominent techniques used in structured programming is Function Point Analysis (FPA), which measures software complexity by evaluating, from an end-user perspective, the data exchanged between users and an application and between applications. This helps in assessing productivity and quality, as higher function-point counts indicate more complex systems. Additionally, structured programming advocates modular design, in which programs are divided into manageable components, or modules. This approach minimizes interdependencies between modules, reducing the risk of errors and making maintenance more efficient. Overall, structured programming aims to enhance the reliability, maintainability, and efficiency of software systems, reflecting a balanced integration of technology, people, and processes in software management.
On this Page
- Overview
- Structured Design & Simplicity
- Modules
- Modules, Connections & Complexity
- Determining Functional Performance of Modules
- Overhead for Module Writing
- Applications
- Measuring the Complexity of Software
- Issue
- Improving Quality & Productivity in Software Development
- Conclusion
- Terms & Concepts
- Bibliography
- Suggested Reading
This article explains the concept of structured programming and examines some of the key aspects of the structured programming process. The impact of the information systems development life cycle on structured programming is also examined. The use of Function Point Analysis as a tool to measure the complexity of a computer program is explained, along with the basic process of applying it. The issue of improving quality in software development is examined, and various methods that can be employed to improve software quality are investigated.
Keywords Function Point Analysis; Information systems development life cycle (ISDLC); Management Information Systems; Requirements analysis; Software Development Productivity; Structured Development; Structured Programming
Overview
There are two major aspects of structured software development. The first is the overall method of managing software over its life cycle. The second is the style and method in which computer programs are actually written. Software consists of abstract sets of rules that govern the creation, transfer, and transformation of data. Initially existing solely as an idea, software is iteratively refined, becoming completely visible only at its completion. This invisibility is compounded for large software projects, whose logical complexity cannot be held in one person's mind and whose development must be partitioned into a number of tasks assigned to different programmers. Because task descriptions are only models of the intended abstraction, and because each individual performing a task interprets those descriptions through a unique worldview, most software errors occur at the interfaces between modules written by different programmers (Zelkowitz, 1978).
How software is developed and managed over its lifecycle varies considerably from organization to organization. In large organizations that use numerous and varied applications software packages, a disciplined and structured approach to software management is generally followed. The Information System Development Life Cycle (ISDLC) is an established concept in the MIS arena. The traditional approach to the ISDLC is that a development project has to undergo a series of phases where the completion of each is a prerequisite to the commencement of the next and where each phase consists of a related group of steps. The general scheme for the ISDLC is similar almost everywhere. It typically contains four major phases consisting of several steps each:
- The Definition Phase: Consisting of preliminary analysis, feasibility study, information analysis, and system design.
- The Construction Phase: Consisting of programming, development of procedures, unit testing, quality control, and documentation.
- The Implementation Phase: Consisting of user training, conversion of old systems to new systems, thorough field testing, and then a move to full operations.
- The Maintenance Phase: After the system is in full operation, updates are made to assure continued operations as new equipment or upgrades to operating systems occur. Enhancements to the system can also be made to meet changing user requirements.
The traditional approach to software management advocates a rigid ISDLC in order to assure control over the development process. In practice, however, development processes are not that rigid. They vary with respect to the complexity of the system under development, the importance attached to that system, and the user's environment (Ahituv, Hadass & Neumann, 1984). The various steps of the ISDLC are usually performed on all projects, but not necessarily in the traditional order. For example, the testing, quality control, and documentation steps may not occur until everybody involved is satisfied with data models or with prototypes of systems.
Structured Design & Simplicity
Structured design is a set of general program design considerations and techniques for making coding, debugging, and modification easier, faster, and less expensive by reducing complexity. The extent to which structured programming methods are followed varies from organization to organization. In general, the more complex a system is the more likely it is that structured design and programming methods will be applied to the development process.
Simplicity is the primary method for evaluating alternative designs to reduce debugging and modification time. Simplicity can be enhanced by dividing the applications software into separate pieces in such a way that pieces can be considered, implemented, fixed, and changed with minimal consideration or effect on the other pieces of the software. Observability (the ability to easily perceive how and why actions occur) is another useful consideration that can help in designing programs that can be changed easily. Consideration of the effect of reasonable changes is also valuable for evaluating alternative designs.
Modules
Structured programming and simplicity guidelines call for developing software in modules. The term module refers to a set of one or more contiguous program statements having a name by which other parts of the system can invoke it, and preferably having its own distinct set of variable names. Examples of modules are PL/I procedures, FORTRAN mainlines and subprograms, and, in general, subroutines of all types. These considerations always relate to the program statements as coded, since it is the programmer's ability to understand and change the source program that is at issue.
While it is conceptually useful to discuss dividing whole programs into smaller pieces, the techniques for designing new, independent modules from scratch are simple. On the other hand, it may be difficult to divide an existing program into separate pieces without increasing its complexity, because of the amount of overlapped code and other interrelationships that usually exist.
Modules, Connections & Complexity
The fewer and simpler the connections between modules, the easier it is to understand each module without reference to other modules. Minimizing connections between modules also minimizes the paths along which changes and errors can propagate into other parts of the system. This helps to eliminate disastrous "ripple" effects, where changes in one part cause errors in another, necessitating additional changes elsewhere that often give rise to new errors. The widely used technique of using common data areas (or global variables or modules without their own distinct set of variable names) can often result in an enormous number of connections between the modules of a program.
The complexity of a system is affected not only by the number of connections but by the degree to which each connection couples (associates) two modules, making them interdependent rather than independent. Coupling is the measure of the strength of association established by a connection from one module to another. Strong coupling complicates a system since a module is harder to understand, change, or correct by itself if it is highly interrelated with other modules. Structure can be improved and complexity reduced by designing systems with the weakest possible coupling between modules.
The degree of coupling established by a particular connection is a function of several factors, and thus it is difficult to establish a simple index of coupling. Coupling depends on how complicated the connection is, on whether the connection refers to the module itself or something inside it, and on what is being sent or received.
Coupling increases with complexity or obscurity of the interface. Coupling is lower when the connection is with the normal module interface than when the connection is with an internal component. Coupling is lower with data connections than with control connections, which are in turn lower than hybrid connections (modification of one module's code by another module).
Every element in the common environment, whether used by particular modules or not, constitutes a separate path along which errors and changes can propagate. Each element in the common environment adds to the complexity of the total system. Changes to, and new uses of, the common area can potentially impact all modules in unpredictable ways. Data references may become unplanned, uncontrolled, and even unknown.
A module interfacing with a common environment for some of its input or output data is, on average, more difficult to use. It is somewhat clumsier to establish a new and unique data context on each call of a module when data passage is via a common environment. Without analysis of the entire set of shared modules or careful saving and restoration of values, a new use is likely to interfere with other uses of the common environment and could propagate errors into other modules. As to future growth of a given system, once the commitment is made to communicate via a common environment, any new module will have to be plugged into the common environment, compounding the total complexity even more.
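These distinctions are easier to see in code. The following is a minimal sketch in Python, with hypothetical module and variable names, contrasting a common-environment connection, a control connection, and a plain data connection through a module's normal interface:

```python
# Hypothetical payroll modules illustrating degrees of coupling.

payroll_state = {"gross": 0.0, "rate": 0.0}  # common data area shared by many modules


def compute_tax_shared():
    # Common-environment coupling: reads and writes shared state, so this
    # module cannot be understood or tested without knowing every other
    # module that touches payroll_state.
    payroll_state["tax"] = payroll_state["gross"] * payroll_state["rate"]


def compute_tax_or_net(amount, rate, mode):
    # Control coupling: the caller's flag steers this module's internal
    # logic, so caller and callee must be understood together.
    if mode == "net":
        return amount - amount * rate
    return amount * rate


def compute_tax(gross, rate):
    # Data coupling only: everything the module needs arrives through its
    # normal interface, and the result leaves the same way.
    return gross * rate


print(compute_tax(50_000.0, 0.20))  # 10000.0 -- no hidden context required
```

Of the three, only the last can be called anywhere with no knowledge of the rest of the system, which is exactly the weak coupling the design guidance above recommends.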
The complexity of an interface is a matter of how much information is needed to state or to understand the connection. Thus, obvious relationships result in lower coupling than obscure or inferred ones. The more syntactic units (such as parameters) in the statement of a connection, the higher the coupling. Thus, extraneous elements irrelevant to the programmer's and the modules' immediate task can unnecessarily increase coupling.
Determining Functional Performance of Modules
A useful technique in determining whether a module is functionally bound is writing a sentence describing the function (purpose) of the module, and then examining the sentence. The following tests can be made:
- If the sentence has to be a compound sentence, contain a comma, or contain more than one verb, the module is probably performing more than one function; therefore, it probably has sequential or communicational binding.
- If the sentence contains words relating to time, such as "first," "next," "then," "after," "when," "start," etc., then the module probably has sequential or temporal binding.
- If the predicate of the sentence doesn't contain a single specific object following the verb, the module is probably logically bound. For example, Edit All Data has logical binding; Edit Source Statement may have functional binding.
Words such as "initialize," "clean-up," etc. imply temporal binding. Functionally bound modules can always be described by way of their elements using a compound sentence. But if the above language is unavoidable while still completely describing the module's function, then the module is probably not functionally bound.
A predictable, or well-behaved, module is one that, when given the identical inputs, operates identically each time it is called. Also, a well-behaved module operates independently of its environment.
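As a concrete illustration of the sentence test and of well-behaved modules, consider the hedged Python sketch below; the module names and the transaction format are assumptions made for the example:

```python
# "Read a transaction and update the master file" needs a compound sentence,
# so this module is probably sequentially bound.
def read_transaction_and_update_master(path, master):
    with open(path) as f:                      # function 1: read
        key, value = f.readline().split()
    master[key] = value                        # function 2: update (in place)
    return master


# Functionally bound replacements: one verb and one specific object each.
def read_transaction(path):
    """Read one transaction record."""
    with open(path) as f:
        key, value = f.readline().split()
    return key, value


def update_master(master, key, value):
    """Apply one transaction to the master file image.

    Well-behaved: identical inputs always yield the identical result, and
    the module neither depends on nor mutates its environment.
    """
    updated = dict(master)
    updated[key] = value
    return updated
```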
Overhead for Module Writing
The overhead involved in writing many simple modules lies in the execution time and memory space used by a particular language to effect the call. The designer should realize the adverse effect on maintenance and debugging that may result from striving only for minimum execution time and/or memory. Designers should also remember that programmer cost is, or is rapidly becoming, the major cost of a programming system, and that much of the maintenance will take place in the future, when that trend will be even more pronounced. However, depending on the actual overhead of the language being used, it is quite possible for a structured design to result in less execution and/or memory overhead rather than more.
Size can be used as a signal to look for potential problems. Programmers should carefully examine all modules with fewer than five or more than 100 executable source statements. Modules with a small number of statements may not perform an entire function and hence may not have functional binding. Very small modules can be eliminated by placing their statements in the calling modules. Large modules may include more than one function. A second problem with large modules is understandability and readability. There is evidence that a group of about 30 statements is the upper limit of what can be mastered on a first reading of a module listing.
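The size signal lends itself to a simple check. The sketch below, written in Python and using its standard ast module (an implementation choice for illustration, not something the sources prescribe), counts executable statements per function and flags anything outside the five-to-100 range suggested above:

```python
import ast


def statement_counts(source: str) -> dict:
    """Count the statement nodes inside each function definition."""
    counts = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # ast.walk yields the function node itself, so subtract one.
            counts[node.name] = sum(
                isinstance(n, ast.stmt) for n in ast.walk(node)) - 1
    return counts


def flag_suspect_modules(source: str, low: int = 5, high: int = 100):
    for name, n in statement_counts(source).items():
        if n < low:
            print(f"{name}: {n} statements -- may not be an entire function")
        elif n > high:
            print(f"{name}: {n} statements -- may bundle several functions")


flag_suspect_modules("def tiny():\n    return 1\n")
# tiny: 1 statements -- may not be an entire function
```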
Structured design reduces the effort needed to fix and modify programs. If all programs were written in a form where there was one module, for example, that retrieved a record from the master file given its key, then changing operating systems, file access techniques, file blocking, or I/O devices would be greatly simplified. And if all programs in the installation retrieved records from a given file through that same module, then one properly rewritten module would have all the installation's programs working under the new constraints for that file (Stevens, Myers & Constantine, 1999).
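A sketch of that single-access-module idea, with hypothetical names and with Python's standard shelve module standing in for whatever keyed file access technique the installation actually uses:

```python
import shelve

MASTER_FILE = "master.db"  # assumed keyed master file


def get_master_record(key):
    """Retrieve one record from the master file, given its key.

    Every program in the installation goes through this one module, so a
    change of file organization, blocking, or I/O device means rewriting
    only the body below.
    """
    with shelve.open(MASTER_FILE) as db:
        return db.get(key)
```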
Applications
Measuring the Complexity of Software
One of the most popular methods of measuring the complexity of a program is to assess the number of function points in the software being developed or modified. IS analysts and designers determine the functional needs of applications software through the requirements analysis process, which involves meetings and discussions with the people in an organization who will actually use the software.
Function Point Analysis (FPA) was proposed by IBM's Allan J. Albrecht in 1979 and was later revised in 1983. It sizes an application system from the end-user's perspective by identifying data exchanges between users and the software application and between the software application and other applications. The productivity of a software project is usually measured by the ratio of function points (FPs) delivered to the programming effort expended, in hours or person-months.
An FP is a measure of work product. Each FP can also be seen as a unit of measurement of the complexity of an application or an end-user business function. It is determined by the components relating to information processing (input, output, inquiry, file, and interface file) and by the general application characteristics. There are three types of FPs: development FPs, enhancement FPs, and support FPs.
The value of an FP is influenced by the number of file types referenced (FTRs), record element types (RETs), and data element types (DETs) used by an application. Hence, the more file types, record types, and data element types an application uses, the more FPs the application delivers.
Computation of FPs consists of several steps. The first step consists of determining the external boundary for the application. This is followed by identifying the major data and transactional function types from the perspectives of the users. These include external input types (IT), external output types (OT), logical internal file types (FT), external interface file types (EI) and external inquiry types (QT).
Next, for each of the function types, the complexity level of information processing is determined by the number of FTRs, RETs and DETs the application refers to. FPs can be used as a measure of output from an IS department. It measures productivity by computing output (in FPs) over input (in man-days or dollars). The quality of an information system can be measured by the number of defects per 1,000 FPs (Bock & Klepper, 1992).
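The counting steps above can be condensed into a short calculation. The Python sketch below uses the complexity weights commonly published for Albrecht-style function point counting; treat the weights, the sample counts, the ten person-months, and the three defects as illustrative assumptions rather than values from the sources:

```python
# Weights per function type at (low, average, high) complexity, as commonly
# published for Albrecht-style counting -- an assumption for illustration.
WEIGHTS = {
    "IT": (3, 4, 6),    # external input types
    "OT": (4, 5, 7),    # external output types
    "FT": (7, 10, 15),  # logical internal file types
    "EI": (5, 7, 10),   # external interface file types
    "QT": (3, 4, 6),    # external inquiry types
}
LEVEL = {"low": 0, "average": 1, "high": 2}


def unadjusted_fps(counts):
    """counts maps (function_type, complexity) to the number identified."""
    return sum(WEIGHTS[ftype][LEVEL[cplx]] * n
               for (ftype, cplx), n in counts.items())


counts = {("IT", "average"): 6, ("OT", "high"): 4, ("FT", "low"): 5,
          ("EI", "average"): 2, ("QT", "low"): 8}
fps = unadjusted_fps(counts)  # 6*4 + 4*7 + 5*7 + 2*7 + 8*3 = 125
print(f"unadjusted FPs: {fps}")
print(f"productivity: {fps / 10:.1f} FPs per person-month")    # 10 person-months
print(f"quality: {3 * 1000 / fps:.1f} defects per 1,000 FPs")  # 3 defects found
```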
Issue
Improving Quality & Productivity in Software Development
The widespread adoption of information technology (IT) over the last few decades has helped organizations reap numerous operational and strategic benefits. Consumers have also benefited from IT, as it reduces market frictions caused by geographical separation, price opacity, and information latency. It is widely recognized that improving software development productivity requires a balanced approach toward the three pillars of software management: technology, people, and process.
Due to the human-centric nature of software development, however, the benefits of technological improvements cannot be fully realized without a capable workforce. One way of increasing productivity is to refine the development process. Process improvement not only accelerates the development work but also reduces the effort spent on corrective activities.
Software development organizations now perform module integration more frequently, often on a daily basis. With advanced development and project management tools, it is now possible to obtain system fault data and other related metrics on a near-continuous basis (Chiang & Mookerjee, 2004). Process improvement in the software realm focuses largely on the software quality dimensions of portability, reliability, efficiency, human engineering, and maintainability.
Organizational commitment to skill development, quality policy and goals, and quality-oriented reward schemes are critical aspects of an organizational system for quality. Together, these factors represent what is called the management infrastructure for quality. IS units that have adopted these practices have a sophisticated management infrastructure and hence are better prepared to redesign, formalize, manage, and continuously improve core design and development processes.
Conversely, IS units that have not adopted these practices have a less sophisticated management infrastructure and hence may lack the capability to effectively implement process-level improvements that lead to quality outcomes. Thus, management infrastructure sophistication is a key antecedent of quality performance (Ravichandran & Rai, 2000).
Conclusion
There are two major aspects of structured software development. The first is the overall method of managing software over its life cycle. The second is the style and method in which computer programs are actually written. How software is developed and managed over its lifecycle varies considerably from organization to organization. In large organizations that use numerous and varied applications software packages, a disciplined and structured approach to software management is generally followed. The Information System Development Life Cycle guides software development and maintenance processes.
Structured design is a set of general program design considerations and techniques for making coding, debugging, and modification easier, faster, and less expensive. The process involves programming in manageable and definable modules. The term module is used to refer to a set of one or more contiguous program statements having a name by which other parts of the system can invoke it and preferably having its own distinct set of variable names. The extent to which structured programming methods are followed also varies from organization to organization. In general, the more complex a system is, the more likely it is that structured design and programming methods will be applied to the development process.
One of the most popular methods of measuring the complexity of a computer program is to assess the number of function points in the application software being developed or modified. Function Point Analysis sizes an application system from the end-user's perspective by identifying data exchanges between users and the software application and between the software application and other applications. The productivity of a software project is usually measured by the ratio of function points delivered to the programming effort expended, in hours or person-months.
It is widely recognized that improving software development productivity requires a balanced approach toward the three pillars of software management: technology, people, and process. Process improvement not only accelerates the development work but also reduces the effort spent on corrective activities. Organizational commitment to skill development, quality policy and goals, and quality-oriented reward schemes are critical aspects of an organizational system for quality. Together, these factors represent what is called the management infrastructure for quality. IS units that have adopted these practices have a sophisticated management infrastructure and hence are better prepared to redesign, formalize, manage, and continuously improve core design and development processes.
Terms & Concepts
Coupling: The measure of the strength of association established by a connection from one module to another.
Function Point: A unit of measurement of software work product, expressing the complexity of an application or an end-user business function.
Function Point Analysis: A method of sizing an application system from the end-user's perspective by identifying data exchanges between users and the software application and between the software application and other applications. The productivity of a software project is usually measured by the ratio of function points (FPs) delivered to programming effort.
Information System Development Life Cycle (ISDLC): The multi-step structured process in which an information system is developed and maintained.
Module: A set of one or more contiguous program statements having a name by which other parts of the system can invoke it and preferably having its own distinct set of variable names.
Requirements Analysis Process: The process through which applications software developers work with end users to determine what a proposed system should do and how it should accomplish the various tasks.
Bibliography
Ahituv, N., Hadass, M., & Neumann, S. (1984). A flexible approach to information system development. MIS Quarterly, 8, 69-78. Retrieved July 30, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=4679361&site=ehost-live
Bok, H., & Raman, K. (2000). Software engineering productivity measurement using function points: A case study. Journal of Information Technology (Routledge, Ltd.), 15, 79-90. Retrieved August 5, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=3869943&site=ehost-live
Chiang, I., & Mookerjee, V. (2004). Improving software team productivity. Communications of the ACM, 47, 89-93. Retrieved August 5, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=18515738&site=ehost-live
Ravichandran, T., & Rai, A. (2000). Quality management in systems development: An organizational system perspective. MIS Quarterly, 24, 381-415. Retrieved August 6, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=3617674&site=ehost-live
Stevens, W., Myers, G., & Constantine, L. (1999). Structured design. IBM Systems Journal, 38(2/3), 231. Retrieved August 1, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=2001849&site=ehost-live
Zelkowitz, M. (1978). Perspectives on software engineering. ACM Computing Surveys, 10, 197-216. Retrieved August 2, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=12143915&site=ehost-live
Suggested Reading
Avison, D., & Taylor, V. (1997). Information systems development methodologies: A classification according to problem situation. Journal of Information Technology (Routledge, Ltd.), 12, 73-81. Retrieved July 23, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=6270862&site=ehost-live
Connell, J.L. & Schafer, L.B. (1989). Structured rapid prototyping. Englewood Cliffs, NJ: Yourdon Press.
Hughes, J., & Wood-Harper, T. (1999). Systems development as a research act. Journal of Information Technology (Routledge, Ltd.), 14, 83-94. Retrieved July 23, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=3869907&site=ehost-live
Procaccino, D., & Verner, J. (2006). Defining and contributing to software development success. Communications of the ACM, 49, 79-83. Retrieved July 24, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=21638985&site=ehost-live
Sakthivel, S. (1991). A survey of requirements verification techniques. Journal of Information Technology (Routledge, Ltd.), 6, 68-79. Retrieved July 23, 2007, from EBSCO Online Database Academic Search Premier. http://search.ebscohost.com/login.aspx?direct=true&db=aph&AN=5418120&site=ehost-live
Shustek, L. (2008). Donald Knuth: A life's work interrupted. Communications of the ACM, 51, 31-35. Retrieved October 31, 2013, from EBSCO Online Database Business Source Complete. http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=33662978&site=ehost-live