History of the US Health Care Industry

Health care has changed drastically throughout the history of the United States. Drawing on developments in medical science, business, education, and other fields, the American health care system evolved into a highly complex industry affecting virtually every aspect of society. This evolution produced many important public health breakthroughs but also serious challenges, especially problems of access rooted in the country's unique health insurance system.

Health care has been part of human society since ancient times. While at its broadest the field can include any individual attempt to maintain or improve health, over the centuries it became associated with specialized professional efforts, from doctors and nurses to pharmacists, psychologists, and many others. In addition to the actual provision of care, the health care industry can also include the systems that make those services possible, such as education and training, financing, and political support. As such, health care is closely entwined with many broad social and economic forces.

By the time the United States was formed, Western medicine was built on centuries of study and practice. However, most medical practitioners were generalists and health care was still largely an informal, localized affair. Limited scientific knowledge meant that many conditions were unexplained or misunderstood and treatment options were often inadequate. Many years would pass before the medical profession clearly did more good than harm. Yet, in general, health care progressed as scientific inquiry emerged and practices became more standardized.

The first hospital in the American colonies, Pennsylvania Hospital, opened in 1751. Numerous pharmacies in the colonies marketed a wide variety of purported remedies—some of which contained such dangerous ingredients as mercury, alcohol, or opium. By 1800 the United States had four medical schools, associated with the University of Pennsylvania, Columbia, Harvard, and Dartmouth; the number of medical schools grew from five in 1810 to fifty-two by 1850. Some physicians attended formal medical colleges, such as the prestigious Jefferson Medical College, established in Philadelphia in 1824. Many others, however, learned the profession through apprenticeships or simply began practicing without any training. Massachusetts delegated licensing to its state medical society, established in 1781, and leaving such matters to state medical societies became commonplace. Nurses and midwives were largely self-appointed. By 1850, there were about 41,000 physicians in the United States—about 176 for every 100,000 people, a very high proportion by historical standards. Becoming a doctor was relatively easy, and the pay could be very good.

Scientific Advances

Medical science did advance, and knowledge of new developments spread rapidly. Edward Jenner developed a successful smallpox vaccine in England in the late 1790s. Massachusetts was quick to promote vaccination against smallpox, and New Hampshire required it from 1835; smallpox was virtually wiped out as a result. Quinine was successfully produced in 1822 and became the standard treatment for malaria. By the mid-nineteenth century, gases such as ether and nitrous oxide had come into use as anesthetics in some procedures.

Scientific progress was accompanied by increasing formalization of the health care professions. The American Medical Association (AMA) was formed in 1847, following half a century of growth in state and local medical societies, some of which dated back considerably earlier. Medical journals spread information on diagnosis and treatment. The demand for medical professionals and the rise in complex operations such as amputations during the US Civil War (1861–65) also contributed to the advancement and spread of modern medicine and related systems.

The development of germ theory in mid-nineteenth-century Europe by figures such as Louis Pasteur in France profoundly improved medical science by turning attention to pathogenic microorganisms such as bacteria and their role in infection and disease. Deaths from major infectious diseases such as tuberculosis, diphtheria, and measles accounted for about half of all deaths in the United States before 1880. From that point on, the death rate from infectious diseases began a rapid decline. Techniques of cleansing and sterilization, along with anesthetics, revolutionized surgery during the late nineteenth century.

Public Health

While improvements in medical treatment were important, the widespread health gains that began in the late nineteenth century stemmed more directly from general improvements in nutrition and living standards, as well as the spread of public health measures. As early as colonial times, various governments had created boards of health concerned with sanitary conditions and contagious diseases, though here too the limited scientific understanding of many health issues hampered these early efforts. As the population grew and society advanced, public health issues drew increasing scrutiny.

The obvious filth and stench developing in urban slums focused public attention on the need to upgrade water supplies and waste-disposal systems, partly for aesthetic reasons. Very few cities had sanitary sewers before 1880; most constructed them between then and 1910. A filtered water supply was virtually unknown in 1880, but such supplies reached more than 10 million people by 1910, and some areas had introduced chlorination. Major cities established boards of health. Water was inspected for bacteria, and pasteurization of milk became widespread. Following the example of Providence, Rhode Island, in 1880, public health laboratories became widespread by 1914. School districts instituted physical examinations and enforced compulsory vaccinations. The public schools increasingly spread information from the rising field of home economics, stressing the value of cleanliness, diet, and exercise. These public health measures led to the creation of numerous companies devoted to testing for contaminants and producing the equipment needed to improve sanitation.

The Early Twentieth Century

Around the beginning of the twentieth century, social and political reform movements often overlapped with issues relevant to health care. For example, muckraking literature drew attention to public health problems such as unhealthy conditions in urban slums and the lack of regulation of food and medicine. Upton Sinclair's novel The Jungle (1906), which helped expose unsanitary conditions in the meatpacking industry, was a factor in the passage of the Pure Food and Drug Act of 1906, the law from which the federal Food and Drug Administration (FDA) grew. The law required that products be accurately labeled and forbade certain dangerous ingredients, and the importance of the FDA increased steadily in the following years.

There was a similar push for improved standards in medical education and practice. A groundbreaking study of American medical schools by Abraham Flexner (1910) found wide variation in quality. At the top, Johns Hopkins University had developed the first truly modern medical school (1893). At the bottom, Flexner recommended several schools be closed—and they were. State governments authorized their medical associations to approve medical schools and to examine and license physicians.

These reforms greatly upgraded the quality of the medical profession, but they also made medical education much longer and more expensive. There had been 162 medical schools in the country in 1906; ten years later there were only 95, and the number fell further to 80 in 1923. Between 1900 and 1906, more than 5,000 students per year graduated from medical schools. After 1913, the number dropped below 4,000, and it did not return to the earlier level until 1927. From 1900 to 1906, there had been about 157 physicians for every 100,000 people, but this ratio dropped below 130 after 1923 and did not return to the previous level until the 1960s. The shortfall in physicians was partly offset by the more rapid expansion in the number of professional nurses, who numbered about 50,000 in 1910; that figure doubled by 1920 and doubled again by 1930, reaching about 214,000.

For most of the first half of the twentieth century, American medical practice fell into a simple pattern. Most doctors were family doctors, operating as individual practitioners out of a small office, often in the doctor’s home, and seeing patients both in their office and at the patient’s home. Diagnostic instruments were simple—a stethoscope, a thermometer, perhaps a blood-pressure cuff, sometimes X-ray equipment.

Medical costs were not high. A visit to the doctor might cost $5. In 1929, Americans spent about $3 billion for medical care. Half of this was for physicians or dentists. About $400 million went to hospitals. The number of hospitals rose rapidly in the first quarter of the twentieth century, then leveled off at roughly 6,000 for decades. Another $600 million of health care spending in 1929 was for medicines and other purchased medical items. The drugstore was a familiar Main Street establishment—there were about 58,000 of them during the 1920s, often with a soda fountain, a prescription department, and many over-the-counter (OTC) medicines. Notable among these was aspirin, a proven pain reliever whose many other useful properties were still being discovered. Miles Laboratories was a major supplier of OTC products, including Alka-Seltzer, whose comforting fizz promised relief for headaches or indigestion. Chain drugstores, such as Rexall, became widespread during the 1920s.

World War II

The 1940s represented a turning point in the American medical system. The mass production of penicillin and the discovery of other antibiotics revolutionized the treatment of infections. New treatments greatly improved survival rates among soldiers wounded in military conflict. The pharmaceutical industry stepped up its research and development efforts, aiding its rise as a key component of the broader health care system.

Within the federal government, the National Institutes of Health (NIH), which had been operating on a modest scale since the 1930s, experienced a rapid rise in its budget. NIH research expenditures rose from $33 million in 1952 to $274 million in 1960 and $893 million in 1969. The federal government also created a cabinet-level Department of Health, Education, and Welfare (HEW) in 1953.

World War II also set off major changes in the financing of medical expenses. Employers discovered they could bypass wartime wage controls and high income tax rates by paying medical insurance costs for their workers. In 1948, insurance paid about 6 percent of personal health care costs; the insurance share rose rapidly, reaching 27 percent in 1960. As a result, the ultimate consumers became less sensitive to costs, and prices of medical goods and services began to rise more rapidly than other prices. Between 1950 and 1970, consumer prices in general increased by 61 percent, but medical costs rose by 125 percent.

Medicare and Medicaid

The federal government’s role in the medical world changed dramatically in 1965 with the creation of Medicare and Medicaid. Medicare was a system of medical-expense insurance for people aged sixty-five and older. People became eligible either by paying Social Security tax (to which a Medicare premium was added) or by paying premiums directly. The adoption of Medicare had no appreciable effect on the health indicators of the elderly but greatly improved their financial condition. Medicaid covered medical expenses of eligible low-income persons of any age. About half of the people below the poverty line qualified for Medicaid.

The new federal programs encouraged the spread of health maintenance organizations (HMOs). These offered basic medical services to members for a fixed annual premium. In many cases, the HMO would pay its participating physicians a flat amount for each client enrolled. The Health Maintenance Organization Act of 1973 helped expand the scope of HMOs, viewed as an effective method of controlling costs through “managed care.”

Several health-related federal agencies were created during this period: the Occupational Safety and Health Administration (OSHA, 1970), the Environmental Protection Agency (EPA, 1970), and the Consumer Product Safety Commission (CPSC, 1972). A symbol of the growing federal role was the creation in 1980 of the Department of Health and Human Services, formed when HEW's education functions were split off into a separate Department of Education.

With these new programs, personal medical care expenditures as a share of gross domestic product (GDP) moved steadily upward, from about 3.4 percent in 1960 to 6.6 percent in 1980 and 10 percent at the end of the millennium. Rising demand brought a steady increase in the number of medical schools and their graduates. In 1956, 82 medical schools produced about 7,000 graduates; by 1970, 107 medical schools produced almost 9,000. Even so, supply did not keep up with demand. As a result, many foreign-trained physicians immigrated to the United States, and some cost-conscious Americans traveled abroad for treatment.

The continued rapid rise in medical costs drove up insurance premiums. Many employers stopped offering health insurance or shifted more costs to employees. The plight of the medically uninsured became a significant political issue. During Bill Clinton’s first term as president (1993–97), First Lady Hillary Clinton tried unsuccessfully to put together a program to expand medical insurance provided by the federal government. In 1997 Congress did create the State Children’s Health Insurance Program, which substantially enlarged insurance coverage for children. A complex prescription drug benefit was added to Medicare effective in 2006.

Twenty-First Century Developments

By 2000 the United States had developed a very large and diverse health care system. State and federal governments provided public health services such as safe water supplies, waste disposal, and inspection of goods, services, housing, and workplaces. Total health service employment increased from 9.3 million in 1990 to 12.7 million in 2000 and 14.9 million in 2006. The number of physicians increased from 615,000 in 1990 to 814,000 in 2000 and 902,000 in 2005. By 2005, one-fourth of all physicians had attended foreign medical schools.

By 2005 personal health care expenditures were about $1.7 trillion, of which 85 percent was covered by third-party (chiefly insurance) sources. There were 420 HMOs, enrolling about 69 million people. Medicare covered 42 million people and Medicaid 38 million. Fifteen percent of the population was not covered by medical insurance. Government programs of all kinds accounted for $747 billion of personal health care expenditures, representing about 44 percent of the total.

The high costs of medical care and the fact that many Americans lacked health insurance led to many calls for reform of the system, but there was deep political disagreement on how to proceed. Some observers favored a shift toward some form of universal health care, following the example of most other wealthy industrialized nations. Conservatives, however, tended to prefer maintaining a system based on private insurance coverage, and some sought to cut existing federal health care programs as a means of reducing government spending. The Democratic administration of President Barack Obama (2009–17) pursued a compromise solution, aiming to extend health insurance coverage to all Americans but stopping short of a single-payer scheme or other universal system. Despite intense Republican opposition, Congress passed landmark reform to this end with the Affordable Care Act (ACA, also widely known as "Obamacare") in 2010. Among other things, the ACA abolished the practice of charging higher insurance rates for, or altogether denying coverage to, patients with preexisting conditions; raised to twenty-six the maximum age at which a person could remain on their parents' insurance; and established government insurance subsidies for individuals and families with incomes below 400 percent of the poverty level. Studies soon showed that the ACA was highly effective in reducing the number of uninsured Americans. Although its core policies eventually proved widely popular, however, the law continued to attract intense political controversy.

Meanwhile, most major indicators of health continued to show steady improvement. Life expectancy at birth, which was about forty-seven years in 1900, rose to seventy-four years in 1980 and about eighty years in 2016. These figures are strongly influenced by lifestyle factors such as smoking, automobile accidents, and violence. A better indicator of medical effectiveness is the number of additional years a sixty-year-old person can expect to live, which rose from fifteen years in 1900 to twenty years in 1980 and twenty-two years in 2003. Infant mortality, which was 100 per thousand live births in 1915 and 13 per thousand in 1980, dropped further to 5.8 per thousand in 2016. Ongoing improvements in medicine and health care technology, as well as public health efforts, were considered major contributors to this positive trend. However, some areas of health care drew growing concern. In particular, mental health issues gained greater recognition, and many experts suggested that societal changes such as the rise of the internet and social media brought notable risks.

While the medical and technological aspects of health care progressed, the social and economic aspects remained subject to intense partisan political debate. On the campaign trail for the 2016 US presidential election, Republican opposition to the ACA was a prominent issue. Yet despite the election of Republican Donald Trump, multiple efforts to repeal the law during his administration (2017–21) proved unsuccessful. Then, in early 2020, the emergence of the global COVID-19 pandemic thrust many health care issues into the spotlight. Not only did the massive public health crisis claim many lives (contributing to a sharp decline in US life expectancy and many other health indicators), it also revealed how capacity issues and supply chain disruptions could quickly and severely strain hospitals and other health facilities across the nation. Front-line health care workers faced especially high risk of exposure to the highly contagious respiratory virus as well as struggles with burnout and other mental health challenges. And while the unprecedentedly rapid development of COVID-19 vaccines showed the power of medical science, vaccine hesitancy, driven in part by politicization of the pandemic, proved to be a serious and growing problem. Many experts warned that distrust of mainstream medicine among certain segments of the population, stirred by unreliable online media and taken up by extremist politicians such as Trump, posed a major threat to public health.

Other health care issues that generated much political attention in the 2010s and 2020s included abortion and transgender health care. In general, liberals supported abortion rights and efforts to improve health care for transgender people, while conservatives opposed abortion and transgender care. Many state governments passed relevant legislation depending on the political party in power. The US Supreme Court decision Dobbs v. Jackson Women's Health Organization (2022), which overturned the constitutional right to abortion set in Roe v. Wade (1973), further galvanized action on both sides.

Racism and racial and economic disparities in health care were also widely discussed in the 2020s, and the COVID-19 pandemic shed new light on these issues. Racial and ethnic minorities were at greater risk of infection and death from COVID-19 and were more likely to require hospitalization. Factors suggested by experts included racism itself, medical conditions aggravated by the stress of coping with racial discrimination, the kinds of work people did, where they lived, and their access to health care. The death of Olympic sprinter Tori Bowie, who died in 2023 of complications after going into labor, and the experience of tennis great Serena Williams, who had to insist on the medical tests that saved her life after giving birth in 2017, resonated with many minority women. Among women overall, the United States has one of the worst maternal mortality rates of any wealthy country. In 2022, the maternal mortality rate for Black women was 49.5 deaths per 100,000 births, compared with 19 deaths for White women. Experts say many of these deaths are preventable.

Like other countries, the United States faced physician shortages that were projected to worsen through the mid-2030s. A study conducted in 2023 for the Association of American Medical Colleges anticipated a shortage of up to eighty-six thousand physicians by 2036, even assuming that investments in graduate medical education from states, Congress, teaching hospitals, and others would continue to grow.

Bibliography

"About the Affordable Care Act." US Department of Health and Human Services, 17 Mar. 2022, www.hhs.gov/healthcare/about-the-aca/index.html. Accessed 29 Aug. 2023.

Barr, Donald A. Introduction to US Health Policy: The Organization, Financing, and Delivery of Health Care in America. Johns Hopkins UP, 2023.

Brangham, William, and Shoshana Dubnow. "American Black Women Face Disproportionately High Rates of Maternal Mortality." PBS NewsHour, 28 June 2023, www.pbs.org/newshour/show/american-black-women-face-disproportionately-high-rates-of-maternal-mortality. Accessed 2 May 2024.

Coddington, Dean C., Elizabeth A. Fischer, Keith D. Moore, and Richard L. Clark. Beyond Managed Care: How Consumers and Technology Are Changing the Future of Health Care. Jossey-Bass, 2000.

DeSimone, Daniel C. "Why Are People of Color More at Risk of Being Affected by Coronavirus Disease 2019 (COVID-19)?" Mayo Clinic, 6 Oct. 2022, www.mayoclinic.org/diseases-conditions/coronavirus/expert-answers/coronavirus-infection-by-race/faq-20488802. Accessed 2 May 2024.

Feldstein, Paul J. Health Care Economics. 5th ed., Delmar, 1999.

Henderson, James W. Health Economics and Policy. 2nd ed., South-Western, 2002.

Kongstvedt, Peter R. Health Insurance and Managed Care. Jones & Bartlett Learning, 2015.

"New AAMC Report Shows Continuing Projected Physician Shortage." Association of American Medical Colleges, 21 Mar. 2024, www.aamc.org/news/press-releases/new-aamc-report-shows-continuing-projected-physician-shortage. Accessed 2 May 2024.

Pagano, Michael. Understanding Health Care in America: Culture, Capitalism and Communication. Routledge, 2021.

Rejda, George. Social Insurance and Economic Security. 6th ed., Prentice-Hall, 1999.

Rodriquez, Brittany. Health Insurance Affordability and the Role of Premium Tax Credits. Nova Science, 2015.

Shafer, Henry Burnell. The American Medical Profession, 1783–1850. Columbia UP, 1936.

Simmons-Duffin, Selena. "The CDC Says Maternal Mortality Rates in the U.S. Got Better, After a Pandemic Spike." National Public Radio, 2 May 2024, www.npr.org/sections/health-shots/2024/05/02/1248563521/maternal-mortality-rates-2022-cdc. Accessed 2 May 2024.

Simmons-Duffin, Selena. "'Live Free and Die?' The Sad State of US Life Expectancy." NPR, 25 Mar. 2023, www.npr.org/sections/health-shots/2023/03/25/1164819944/live-free-and-die-the-sad-state-of-u-s-life-expectancy. Accessed 29 Aug. 2023.

Smith, Timothy M. "How Legacy of Medical Racism Shapes U.S. Today." American Medical Association, 31 Jan. 2022, www.ama-assn.org/delivering-care/health-equity/how-legacy-medical-racism-shapes-us-health-care-today. Accessed 2 May 2024.

Stevens, Rosemary E., Charles E. Rosenberg, and Lawton R. Burns, editors. History and Health Policy in the United States: Putting the Past Back In. Rutgers UP, 2006.

"Timeline: History of Health Reform in the US." Kaiser Family Foundation, www.kff.org/wp-content/uploads/2011/03/5-02-13-history-of-health-reform.pdf. Accessed 29 Aug. 2023.