Computer hardware and peripherals industry

Industry Snapshot

GENERAL INDUSTRY: Manufacturing

CAREER CLUSTERS: Manufacturing; Science, Technology, Engineering, and Math

SUBCATEGORY INDUSTRIES: Computer Display Manufacturing; Computer Printer Manufacturing; Computer Storage Device Manufacturing; Computer Workstation Manufacturing; Integrated Circuit Manufacturing; Mainframe Computer Manufacturing; Personal Computer Manufacturing; Semiconductor and Other Electronic Component Manufacturing

RELATED INDUSTRIES: Computer Software Industry; Computer Systems Industry; Internet and Cyber Communications Industry; Retail Trade and Service Industry; Telecommunications Equipment Industry; Video, Computer, and Virtual Reality Games Industry

ANNUAL DOMESTIC REVENUES: US$64.6 billion (semiconductor and circuit manufacturing; IBISWorld, 2024); US$60.7 billion (circuit board and electronic component manufacturing; IBISWorld, 2023); US$11.5 billion (computer peripherals manufacturing; IBISWorld, 2023); US$9.7 billion (computer manufacturing; IBISWorld, 2023)

ANNUAL GLOBAL REVENUES: US$274.1 billion (IBISWorld, 2023)

NAICS NUMBERS: 3341, 33441

Summary

The computer hardware and peripherals industry affects nearly every aspect of modern life, from websites and e-commerce applications to medicine and telecommunications. The industry manufactures computers and their peripherals. (A peripheral is any computer part or device other than the central processing unit, or CPU, and the working memory. Common peripherals include printers, external storage devices, keyboards, and mice.) Companies active in the computer hardware and peripherals industry range from small start-ups offering just a few niche products to international corporations, such as International Business Machines (IBM) and Hewlett-Packard (HP), that have manufacturing and engineering operations across the globe and employ tens of thousands of people.

History of the Industry

The earliest counting device was the abacus, a tablet of stone, wood, or metal upon which stones were moved along grooves or painted lines. Abacuses were used perhaps as early as 3000 BCE by ancient civilizations such as the Sumerians and the Babylonians. (Later versions consisted of framed devices using freely moving beads.) Similar counting tables were later created to simplify such specialized and diverse fields as navigation, astronomy, and trigonometry. By the nineteenth century, the term "computer" referred to a person who performed complex calculations for a living using mechanical devices. The term applied particularly to astronomers and surveyors.

In 1821, English inventor Charles Babbage, frustrated with the inaccuracies in the mathematical tables then available, began experimenting with automatic calculating machines. In 1834, Babbage designed the analytical engine, the first true forerunner of modern computing systems. Babbage's device was to be steam-powered and capable of completing complex computations that went beyond numbers. His collaborator Ada Lovelace published an annotated translation of an article about the analytical engine in 1843 that foreshadowed the era of modern computing. She imagined using numbers to represent letters of the alphabet, allowing the analytical engine to store many different types of data. However, a working version of the ambitious project never materialized.

Several decades later, United States Census Bureau employee Herman Hollerith developed the punch-card system, which the bureau used to calculate population data as early as 1890. Hollerith further refined his punch-card system after 1896, when he founded the Tabulating Machine Company, the precursor to IBM. Punch-card technology formed the basis for many early computer models.

In 1937, physicist John Vincent Atanasoff conceived the first digital electronic computer at Iowa State University, building it over the following years with his graduate student Clifford Berry. This breakthrough was soon followed by a rapid expansion in technology during World War II, as governments poured money into computer research. In 1946, this research led to the creation of the Electronic Numerical Integrator and Computer (ENIAC), a general-purpose computer that was developed to assist the US military in calculating weapon trajectories. The project was led by J. Presper Eckert and John W. Mauchly at the University of Pennsylvania. The enormous computer took up eighteen hundred square feet of space, weighed thirty tons, and used nearly eighteen thousand vacuum tubes. It relied on removable plug boards to communicate instructions; reprogramming the computer involved rewiring the entire system, a process that took days.

Engineer Jack Kilby of Texas Instruments demonstrated the first working integrated circuit, or computer chip, in 1958. This miniature electronic circuit revolutionized computer science. In 1968, Robert Noyce and Gordon Moore founded Intel Corporation, where they were soon joined by Andrew Grove, with the intention of developing an affordable silicon-based memory chip. Intel would become one of the largest manufacturers of integrated circuits. By 1971, the invention of the floppy disk allowed computer users to store and transfer data easily. Among the earliest modern personal desktop computers was the Apple I, developed by Steve Wozniak in 1976. Wozniak and his partner, Steve Jobs, sold the Apple I as a fully assembled circuit board rather than a kit; connected to a keyboard, it could be plugged directly into a standard television for display.

In 1981, IBM introduced the personal computer, or PC, which utilized the Microsoft Disk Operating System (MS-DOS) to run all of its applications. Most personal computers sold since can trace their lineage directly to IBM's first model. Just three years later, Apple launched the Macintosh, the first popular computer system to use a mouse to control an advanced graphical user interface (GUI). Since then, computers have become a ubiquitous part of modern life. Because of the constantly evolving nature of technology, the computer hardware and peripherals industry continues to offer creative entrepreneurs many opportunities to form successful new businesses. There are also many established companies that employ thousands of people, including engineers, scientists, production workers, and administrative staff.

The Industry Today

Computers are utilized in nearly every modern industry and have become a necessity for most households. The hardware and peripherals industry builds physical computer components in cooperation with the software industry, which develops programs that run on and control those components. In some instances, a single company may offer both hardware and software, either separately or as part of a single integrated computer system.

The term "peripherals" refers to devices that either allow users to input information or allow computers to make information externally available to users. Common peripherals include keyboards, monitors, computer mice, external storage devices, cameras, microphones, speakers, headphones, and projectors. Modern computer components require such precision in their design and construction that computer modeling and automated manufacturing techniques are necessary to create them. The testing of finished components is also handled through automated processes. Because of this high level of automation, the hardware industry hires few production workers. Moreover, the physical manufacturing process is increasingly being outsourced overseas. However, most research, design, and development of computers by American companies is still based in the United States.

Some computer hardware and peripherals, such as the specialized servers utilized by schools or hospitals, can be offered only by midsize or large firms because developing new products requires high expenditures on both research and manufacturing. Hardware and peripheral manufacturers often work closely with companies in the information technology and cyber-communications sectors to develop products for database systems, web servers, and computer networks. Hardware manufacturers also commonly produce specialized components and integrated circuits for use in other products, including automobiles, aircraft, medical devices, toys, and personal electronics.

The largest firms manufacture the majority of all computer components, such as integrated circuits and hard drives. These components usually require "clean-room" manufacturing facilities, in which employees must wear full-length coveralls and face masks and often pass through airlocks before entering. The atmosphere in clean rooms is carefully controlled with air filtration systems and particulate monitors.

Integrated circuits, also known as microchips, are created by building up multiple layers of semiconductor materials, such as silicon and germanium compounds, on a base structure known as a substrate. Transistors, which can amplify electrical signals or switch them on and off, are formed on the substrate, and microscopic pathways are etched into the structure, connecting multiple transistors in the same manner that streets connect buildings in a city. Information is passed along these pathways as bursts of electrical impulses. Many different computer components rely on integrated circuits, including CPUs, motherboards, and video cards.
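
To make that switching behavior concrete, the following Python sketch (an illustration of the general principle, not any particular chip's design) models a pair of transistors as a NAND gate and composes larger logic from it, much as etched pathways connect physical transistors into functional units:

    # Illustrative only: each transistor is modeled as an on/off switch.
    def nand(a: int, b: int) -> int:
        """Two switching transistors in series: the output goes low
        only when both inputs are on. This is a NAND gate."""
        return 0 if (a and b) else 1

    # Every other logic gate can be composed from NAND gates alone.
    def not_(a: int) -> int:
        return nand(a, a)

    def and_(a: int, b: int) -> int:
        return not_(nand(a, b))

    def xor(a: int, b: int) -> int:
        n = nand(a, b)
        return nand(nand(a, n), nand(b, n))

    # A half adder: one-bit addition built purely from switching.
    def half_adder(a: int, b: int) -> tuple[int, int]:
        return xor(a, b), and_(a, b)  # (sum bit, carry bit)

    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> sum={s}, carry={c}")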

Hard drives store digital data for long periods of time, and those data can be accessed easily and quickly at any time. Inside a disk drive, an actuator arm locates and reads data stored on magnetic disks as binary ones and zeroes. The actuator arm also contains recording elements that can write data to the disk by turning the magnetic charge of a tiny segment of the disk on or off; each such segment represents one binary digit, or bit. However, as prices declined, the industry began to move away from magnetic hard drives in favor of solid-state drives (SSDs), which use flash memory to achieve faster speeds, greater reliability, and smaller size.
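
A minimal Python sketch of this storage model (the class name and geometry here are invented for illustration) treats a platter as a run of fixed-size sectors and flips one bit at a time, just as a recording head flips one magnetic segment:

    SECTOR_SIZE = 512  # bytes per sector, a common hard-drive convention

    class SimulatedPlatter:
        """Hypothetical stand-in for a magnetic platter: a flat array of bits."""

        def __init__(self, sectors: int):
            self.data = bytearray(sectors * SECTOR_SIZE)  # every bit starts at 0

        def write_bit(self, bit_index: int, value: int) -> None:
            # Turn one "magnetic segment" on (1) or off (0).
            byte, offset = divmod(bit_index, 8)
            if value:
                self.data[byte] |= 1 << offset
            else:
                self.data[byte] &= ~(1 << offset) & 0xFF

        def read_bit(self, bit_index: int) -> int:
            byte, offset = divmod(bit_index, 8)
            return (self.data[byte] >> offset) & 1

    platter = SimulatedPlatter(sectors=4)
    platter.write_bit(1000, 1)
    print(platter.read_bit(1000))  # 1
    platter.write_bit(1000, 0)
    print(platter.read_bit(1000))  # 0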

In both large and small firms, employees from many different departments, including research and development, engineering, manufacturing, and sales and support, work together closely to develop and perfect new products. A company may sell its products directly to consumers, or it may sell them wholesale to electronics or big-box retailers. In the competitive computer marketplace, companies succeed based on their dedication to innovative design and manufacturing techniques. As computers continue to become fully integrated in people's daily lives, the hardware and peripherals industry will play an increasingly important role in all sectors of the global business community.

In 1965, Gordon Moore, who would go on to cofound Intel, made a startling prediction, now known as Moore's law: the number of transistors on an integrated circuit (and thus the rough computing power of the chip) would double every two years. What may have seemed an overly ambitious forecast has held true into the twenty-first century. As transistors have continued to shrink and integrated circuits have become more powerful and more compact, new devices have been developed that blur the boundaries between computers, telecommunications equipment, consumer electronics, and even medical devices. For example, smartphones have more in common with computers than they do with Alexander Graham Bell's original telephone. Many devices, including tablet computers, touch-screen video gaming systems, and satellite navigation devices, are becoming harder to categorize as computers, personal electronics, or household appliances. The rise of the "internet of things" (IoT) means that even items such as refrigerators and washing machines often have some level of computing capability.
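
The doubling rule at the heart of Moore's law is simple compounding, as a back-of-the-envelope Python calculation shows (the 1971 baseline of roughly 2,300 transistors for Intel's 4004 chip is a well-known historical figure, not a number taken from this article):

    # Moore's law as compound doubling: counts double every two years.
    BASE_YEAR, BASE_COUNT = 1971, 2_300  # Intel 4004, approximate transistor count

    def projected_transistors(year: int) -> float:
        return BASE_COUNT * 2 ** ((year - BASE_YEAR) / 2)

    for year in (1971, 1991, 2011, 2021):
        print(f"{year}: ~{projected_transistors(year):,.0f} transistors")

    # The 2021 projection comes to roughly 77 billion transistors, the same
    # order of magnitude as the largest commercial chips of that era.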

Beyond the consumer market, critical organizations such as the US military—which initially funded much of the research that led to the development of early computers such as ENIAC—are increasingly dependent on computer hardware technology. Modern warfare involves constant communication among soldiers utilizing computers and digital satellite technology. Computers are also used by the military to control unpiloted aerial vehicles that perform rescue and reconnaissance missions and launch attacks into dangerous combat zones. Police officers, too, carry computers in their squad cars, and specially designed heavy-duty laptops have been created for use by police and military personnel in the line of duty. The ubiquity of computers demonstrates the deep importance of the hardware and peripherals manufacturing industry.

Industry Outlook

Overview

The computer has revolutionized the world, allowing nearly instantaneous access to almost limitless amounts of information and becoming a necessary component of the operations of almost all businesses. Businesses all over the world must stay current with modern trends in the computer hardware and peripherals industry or face being left behind by their competitors. Many older products that were once the standard have been replaced by better, faster, more efficient technologies. For example, vacuum tubes were replaced by transistors, and floppy disk drives by hard drives and flash drives. Data once held in massive libraries of paper punch cards can now be stored at a fraction of the cost in much smaller storage devices.

If Moore's law continues to hold true, transistors will shrink to the size of an atom in the near future. Research is currently being conducted into new experimental technologies, such as quantum computing, that would make such transistors possible, and quantum computers may one day be able to store data on a subatomic scale. Another experimental hardware technology that may one day revolutionize the field of computer science is the deoxyribonucleic acid (DNA) computer, which uses biochemical genetic coding to store information. While these technologies are still in the early stages of development, it is likely that if they come to fruition they will become as firmly integrated into modern life as is the personal computer. In addition, many computer hardware and peripherals firms are researching new ways to minimize energy consumption and use environmentally sustainable manufacturing techniques.

Many of the computing innovations that are now taken for granted began as simple ideas of entrepreneurs, researchers, scientists, and even amateur hobbyists. Because of the rapid, ceaseless innovation that is the hallmark of this industry, there will always be new opportunities for creative advancement. Nobody can predict what the next great technological leap forward in computer hardware and peripherals will be, but it is certain that the industry will continue to evolve dramatically and its effect on daily life will continue to increase.

While spending on computer hardware and peripherals tends to fall during recessions, the computer industry is very resilient and tends to bounce back as soon as the economy begins to turn around. The computer hardware and peripherals industry has continued to grow overall, despite occasional slowdowns.

However, the market research firm IBISWorld projects that the global computer hardware manufacturing industry will contract in the coming years, despite the economy's anticipated recovery following the COVID-19 pandemic. IBISWorld predicts that new technology will continue to shift demand toward tablets and smartphones, slowing shipments of other computer products. Wearable technology and other mobile alternatives will also threaten the industry.

Employment Advantages

The amount of digitally encoded information will continue to increase exponentially for the foreseeable future, so demand for computer hardware and peripherals will continue to rise overall. In addition, computer usage is skyrocketing in many countries throughout the developing world, including China, India, and Brazil. This trend will further increase global demand for computer hardware and peripherals and therefore employment opportunities in the industry.

Jobs in computer hardware and peripherals manufacturing range widely, from leading a start-up company to assembly-line work in a factory. A general advantage is being part of a dynamic industry that has been, and continues to be, influential worldwide. Employees in successful businesses are likely to be relatively well paid, especially skilled engineers and experienced managers. The US Bureau of Labor Statistics (BLS) found the median annual pay for computer hardware engineers to be $132,360 in 2022, while electrical and electronics engineering technicians had a median annual salary of $66,390 that year. From 2022 to 2032, jobs for computer hardware engineers are expected to grow at a faster-than-average rate, while jobs for electrical and electronics engineering technicians are expected to increase very little, if at all.

Annual Earnings

The computer hardware and peripherals industry is extremely cyclical and is frequently affected by economic bubbles and downturns. Because computer hardware and peripherals represent a major investment for most companies, sales usually drop during recessions. However, as the economy improves, demand for hardware products often rises dramatically. According to IBISWorld estimates, the global industry had revenues of about $274.1 billion in 2023. US domestic revenues included $64.6 billion for semiconductor and circuit manufacturing, $60.7 billion for circuit board and electronic component manufacturing, $11.5 billion for computer peripherals manufacturing, and $9.7 billion for computer manufacturing.

Bibliography

Campbell-Kelly, Martin, et al. Computer: A History of the Information Machine. 3rd ed., Westview Press, 2014.

"Computer and Information Systems Managers." Occupational Outlook Handbook, Bureau of Labor Statistics, US Dept. of Labor, 26 Sept. 2023, www.bls.gov/ooh/management/computer-and-information-systems-managers.htm. Accessed 23 Mar. 2024.

"Computer Hardware Global Market Report 2022." ReportLinker, Mar. 2022, www.reportlinker.com/p06246413/Computer-Hardware-Global-Market-Report.html?utm‗source=GNW. Accessed 23 Mar. 2024.

"Computer Manufacturing Industry in the US - Market Research Report." IBISWorld, June 2022, www.ibisworld.com/united-states/market-research-reports/computer-manufacturing-industry/. Accessed 23 Mar. 2024.

"Computer Peripheral Manufacturing Industry in the US - Market Research Report." IBISWorld, June 2022, www.ibisworld.com/united-states/market-research-reports/computer-peripheral-manufacturing-industry/. Accessed 23 Mar. 2024.

Cortada, James W. The Digital Hand: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries. Oxford UP, 2004.

Eberts, Marjorie, and Margaret Gisler. Careers for Computer Buffs & Other Technological Types. 3rd ed., McGraw-Hill, 2006.

"Global Computer Hardware Manufacturing Industry - Market Research Report." IBISWorld, Sept. 2023, www.ibisworld.com/global/market-research-reports/global-computer-hardware-manufacturing-industry/. Accessed 23 Mar. 2024.

MarketLine Industry Profile: Computer Hardware in the United States. MarketLine, June 2015. Business Source Complete, search.ebscohost.com/login.aspx?direct=true&db=bth&AN=108359579&site=eds-live. Accessed 11 Jan. 2018.

MarketLine Industry Profile: Global Computer Hardware. MarketLine, June 2015. Business Source Complete, search.ebscohost.com/login.aspx?direct=true&db=bth&AN=108359576&site=eds-live. Accessed 11 Jan. 2018.

Swade, Doron. The Babbage Engine. Computer History Museum, www.computerhistory.org/babbage/. Accessed 11 Jan. 2018.

"Timeline." The Silicon Engine, Computer History Museum, www.computerhistory.org/siliconengine/timeline/. Accessed 23 Mar. 2024.

"Computer and Information Systems Managers." Occupational Outlook Handbook, Bureau of Labor Statistics, US Dept. of Labor, 26 Sept. 2023, www.bls.gov/ooh/management/computer-and-information-systems-managers.htm. Accessed 23 Mar. 2024.

Yost, Jeffrey R. The Computer Industry. Greenwood Press, 2005.

Yost, Jeffrey R. Making IT Work: A History of the Computer Services Industry. MIT P, 2017.