Nuclear Technology

Summary

Nuclear technology exploits the particles and energy of the atomic nucleus to produce reactions and radioactive materials that have practical uses in such areas as agriculture, industry, medicine, and consumer products, as well as in the generation of electrical power and the construction of nuclear weapons.

Definition and Basic Principles

Nuclear technology refers to technology that utilizes the energy produced by the reaction of atomic nuclei. The potential of nuclear technology was first identified in the early twentieth century and, by mid-century, had developed into several practical applications.


The scientific principles employed in nuclear technology grew out of the research on radioactivity conducted during the early twentieth century by Henri Becquerel, by Marie and Pierre Curie, and later by the Curies' daughter, Irène Joliot-Curie. This research involved the alpha-, beta-, and gamma-ray activity of uranium and radium. Their successors focused on manipulating the proton, neutron, and electron properties of the atoms of radioactive elements to produce chain reactions and radioactive isotopes of value in numerous fields. Most of the scientific principles involving the peaceful use of nuclear technology relate to the process of fission, an energy- and neutron-releasing process in which the nucleus of an atom is split into two roughly equal parts. The production of nuclear weapons also builds on these principles, but in addition to fission, it can involve fusion, the joining of light nuclei into a single nucleus whose mass is less than the sum of the nuclei used in its creation; the missing mass is released as energy.
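
The energy bookkeeping behind that missing mass can be made concrete with a short calculation. The Python sketch below applies E = mc² (via the standard conversion of one atomic mass unit to about 931.5 MeV) to the deuterium-tritium fusion reaction; the particle masses are approximate published values quoted here purely for illustration.

# Illustrative mass-defect calculation for the D + T -> He-4 + n fusion reaction.
# The masses below (in atomic mass units) are approximate standard values,
# included only to illustrate the principle described above.
U_TO_MEV = 931.494  # energy equivalent of one atomic mass unit, in MeV

deuterium = 2.014102  # u
tritium = 3.016049    # u
helium_4 = 4.002602   # u
neutron = 1.008665    # u

mass_before = deuterium + tritium
mass_after = helium_4 + neutron
mass_defect = mass_before - mass_after  # mass that "disappears" in the reaction

energy_mev = mass_defect * U_TO_MEV
print(f"Mass defect: {mass_defect:.6f} u")
print(f"Energy released per reaction: about {energy_mev:.1f} MeV")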

Since the use of atomic weapons in World War II, a second set of principles relating to the technology has also emerged—one designed to govern its application in a manner beneficial to humanity. The most important of these applied principles pertain to the beneficial and responsible use of nuclear technologies in a manner mindful of human and environmental safety, the technology's continued improvement in terms of efficiency and safety, and the securing of research information and nuclear material from acquisition by those who might use them for destructive purposes.

Background and History

Most accounts of the nuclear age's roots begin with the 1896 work of the French physicist Henri Becquerel, who is credited with discovering radioactivity while exploring uranium's phosphorescence. The center of research in the field remained in France for decades thereafter, most notably in the groundbreaking work of Pierre and Marie Curie and, after Pierre's death in 1906, that of Marie and their daughter Irène Joliot-Curie. It is Marie Curie who is credited with coining the term “radioactivity,” and it was the Curies' work on the properties of decaying uranium that provided the foundation for the research on nuclear fission and nuclear fusion, which eventually led to the creation of the atomic bomb, nuclear electricity-generating power plants, and the radioisotopes so abundantly useful in medicine, industry, and daily life.

By the 1940s, research conducted a decade earlier by British physicist James Chadwick, Italian physicist Enrico Fermi, German chemist Otto Hahn, and others on the instability of the atomic nucleus had progressed sufficiently for scientists to envision nuclear weapons that could, through an induced chain reaction, release enormous amounts of destructive energy. A wartime race to produce the atomic bomb ensued between Germany and the eventual winners: the émigré, British, and American scientists who collaborated in the United States' Manhattan Project under the direction of American physicist J. Robert Oppenheimer. Using reactors constructed at Hanford, Washington, to breed plutonium and enrichment facilities at Oak Ridge, Tennessee, to produce weapons-grade uranium, the project's scientists tested the first atomic bomb on July 16, 1945. Shortly thereafter, the detonation of atomic bombs over the Japanese cities of Hiroshima (August 6) and Nagasaki (August 9) led to Japan's unconditional surrender, ending World War II. The resulting peace was short-lived, however. By 1947, the wartime alliance between the United States and the Soviet Union had dissolved into an intense competition for global influence. When the Soviet Union exploded its first atomic weapon in 1949, a nuclear arms race began between the two countries, one that threatened the safety of the world even as it kept that threat in check: both superpowers recognized that, with such weaponry, no nuclear war could be “won” in any meaningful sense.

Meanwhile, research in the field of nuclear technology began to focus on the use of nuclear energy to generate electricity, a prospect that captured the attention of world leaders concerned about meeting the anticipated postwar demand for electricity in their growing cities. Electricity was first generated from nuclear energy on a test basis near Arco, Idaho, on December 20, 1951. The first nuclear power station went online in the Soviet Union on June 27, 1954, and two years later, the first commercial nuclear power station opened at Sellafield, England. The United States followed in December 1957 with the opening of the Shippingport Atomic Power Station in Pennsylvania. By then, the United Nations (UN) had already convened to explore the peaceful use of nuclear technology, six Western European countries were banding together to form the European Atomic Energy Community (Euratom), a supranational body committed to the cooperative development of nuclear power in Europe, and the UN had created the International Atomic Energy Agency (IAEA) to encourage the peaceful use of nuclear technology on an even broader basis.

In retrospect, the early zeal for opening nuclear energy plants inadvertently paved the way for the technology's declining appeal by the end of the century. The early emphasis was on constructing and bringing online a growing number of power plants, with a resultant neglect of safety issues in the choice of reactor designs, the construction of the plants, and the training of the technicians charged with operating them. The United States Atomic Energy Commission (AEC), for example, was charged with both promoting and regulating the commercial use of nuclear energy, a dual mandate in which the first charge invariably got the better of the second. The consequences were demonstrated most notably in the 1979 Three Mile Island accident near Middletown, Pennsylvania, in which an equipment malfunction compounded by operator error did much to dampen the appeal of nuclear power in the United States. Seven years later, human error combined with a flawed reactor design to cause a disastrous explosion at the Soviet nuclear power plant at Chernobyl, a disaster that severely undercut the appeal of nuclear power throughout most of Western Europe.

The military utility of the atom demonstrated during World War II also encouraged its continued pursuit in the military field in the United States, where nuclear reactors were harnessed to propel fleets of nuclear submarines and aircraft carriers that remain a mainstay of the U.S. Navy. Nonetheless, the real diffusion of nuclear technology has occurred in the civilian field, as a result of scientific and political events separated by nearly a generation. In 1934, Irène Joliot-Curie and her husband, Frédéric Joliot-Curie, discovered that radium-like radioactive elements could be created by bombarding ordinary materials with alpha particles (“induced radioactivity”), a discovery that eventually led to the inexpensive production of radioisotopes.

After World War II ended, though, there was a concerted postwar effort by governments to tightly control all research on nuclear technology.

It was not until 1953, when President Dwight D. Eisenhower proposed a broad sharing of information in his “Atoms for Peace” speech to the UN, that this control loosened. The significant declassification that followed made the fruits of research in the field of nuclear technology widely available, and they began to be utilized in myriad areas.

How It Works

The means by which nuclear technology is applied in the various arenas surveyed below varies from sector to sector. In general, however, nuclear technology produces its benefits either by inducing and controlling reactions among atomic nuclei or by exposing nonnuclear matter to the radiation emitted by radioactive materials.

Nuclear power plants that generate electrical power, for example, function like those that burn fossil fuel: they heat water into steam in order to turn the turbines that produce electricity. The fuel begins as uranium oxide concentrate (commonly known as yellowcake), which is typically enriched and then processed into solid ceramic pellets that are packaged into long fuel rods and inserted into the reactor core to sustain a controlled fission chain reaction. Neutron-absorbing control rods regulate the intensity of the reaction, while circulating water, kept under pressure or allowed to boil depending on the reactor design, carries away the heat.
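
The value of keeping that chain reaction controlled can be illustrated with a simple generation-by-generation model. The Python sketch below is a toy calculation, not reactor physics; the multiplication factors, generation count, and starting population are arbitrary assumptions chosen to show how the neutron population holds steady when the effective multiplication factor k equals 1, dies away when k falls below 1, and grows rapidly when k exceeds 1.

# Toy model of a fission chain reaction: each neutron generation produces
# k times as many neutrons as the one before. All values are illustrative only.

def neutron_population(k: float, generations: int, start: float = 1000.0) -> float:
    """Return the neutron population after the given number of generations."""
    population = start
    for _ in range(generations):
        population *= k
    return population

# Subcritical, critical, and supercritical cases (assumed values of k).
for k in (0.95, 1.00, 1.05):
    final = neutron_population(k, generations=100)
    print(f"k = {k:.2f}: population after 100 generations = {final:,.0f}")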

By contrast, atomic weapons rely on generating an intentionally uncontrolled fission chain reaction for maximum destructive effect, while the more powerful thermonuclear bombs additionally exploit the principle of fusion, using a fission explosion to force light nuclei to fuse into heavier ones and release still more energy.

Elsewhere, the industrial use of radioisotopes rests on the fact that radiation loses energy as it passes through substances. Manufacturers have consequently been able to develop gauges that measure the thickness and density of products and, using radioisotopes as imaging sources, to check finished products for flaws and other sources of weakness. For their part, the fossil fuel industries involved in mining and in oil and gas exploration use radiation-based density measurements to search for resource deposits beneath the soil and sea. The medical community, the agriculture industry, and the producers of consumer goods that use nuclear technology rely largely on radioisotopes, more specifically on exposing selected “targets” to radioisotope-bearing compounds that can either be injected into a patient's body to image how an organ is functioning or be employed to destroy undesirable or harmful material.
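
A rough sketch of how such a thickness gauge works follows. It applies the exponential attenuation relation I = I0 * exp(-mu * x), inferring thickness from the fraction of radiation that reaches a detector on the far side of the material; the attenuation coefficient and intensity readings used here are invented values for illustration only.

import math

# Illustrative radiation thickness gauge based on exponential attenuation:
#   I = I0 * exp(-mu * x)   =>   x = -ln(I / I0) / mu
# The attenuation coefficient and readings below are invented for illustration.

MU_PER_CM = 0.55  # assumed attenuation coefficient for this material, in 1/cm

def thickness_cm(source_intensity: float, detector_intensity: float,
                 mu: float = MU_PER_CM) -> float:
    """Infer material thickness in centimeters from two intensity readings."""
    return -math.log(detector_intensity / source_intensity) / mu

# Example: the detector registers 40 percent of the emitted intensity.
print(f"Estimated thickness: {thickness_cm(1000.0, 400.0):.2f} cm")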

Applications and Products

Although for some, the mention of nuclear technology is most likely to conjure up threatening images of mushroom clouds or out-of-control nuclear power plants, nuclear technology has become a daily part of the lives of citizens in much of the developed world.

Nuclear Power Industry. The nuclear-based power industry that has emerged in the United States, the United Kingdom, France, and more than twenty-five other countries encompasses more than 440 power reactors and produces nearly 10 percent of the world's electrical output. Construction of new plants picked up again in the early 2020s after a long downturn, and by 2024, sixteen countries had reactors under construction. In the United States, more than ninety power reactors supply roughly one-fifth of the nation's electricity, more than any other carbon-free source, and have helped meet the electrical needs of a steadily growing population.

Space Exploration. Space exploration has also profited substantially from nuclear technology, in particular from radioisotope thermoelectric generators (RTGs), which convert the heat of decaying plutonium into electrical power for uncrewed spacecraft such as the Voyager probes launched in 1977.
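
A back-of-the-envelope calculation shows why plutonium-238 suits such generators: its half-life of roughly 87.7 years means the heat source decays only slowly over a mission lasting decades. The Python sketch below applies simple exponential decay; the initial power figure is an assumed round number, and the model ignores the additional losses from thermocouple degradation that real RTGs experience.

# Simple exponential-decay estimate of an RTG's power output over a long mission.
# Pu-238's half-life is roughly 87.7 years; the initial power is an assumed value.

PU238_HALF_LIFE_YEARS = 87.7

def remaining_power(initial_watts: float, years: float,
                    half_life: float = PU238_HALF_LIFE_YEARS) -> float:
    """Power remaining after `years`, assuming decay of the heat source alone."""
    return initial_watts * 0.5 ** (years / half_life)

initial = 470.0  # assumed initial output in watts
for years in (0, 10, 25, 45):
    print(f"After {years:2d} years: about {remaining_power(initial, years):.0f} W")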

Medicine. The main applications of nuclear technology in medicine are in the areas of diagnostic imaging—principally the use of positron emission tomography (PET) and single photon emission computed tomography (SPECT) scans—and radiation in the treatment of cancer and other diseases.

Industry. The centerpiece of nuclear technology in the industrial field is the diagnostic use of radioisotopes and radiation-based imaging to improve the quality of goods, including the quality of the steel used in the automotive industry and the detection of flaws in jet engines.

Agriculture and the Pharmaceutical Industry. Nuclear technology is also used in these sectors to test the quality of products. The U.S. Food and Drug Administration (FDA), for example, requires testing of all new drugs, and 80 percent of that testing employs radioisotopes. Radiation is also widely used to treat products, especially in agriculture, where an irradiation process exposes food to gamma rays from the radioisotope cobalt-60 to eliminate potentially harmful or disease-causing organisms. Even livestock products are covered. Like its counterparts in at least ten other countries, the FDA approves irradiation of pork, poultry, red meat, fruits, vegetables, and spices to kill the bacteria, insects, and parasites that can cause such illnesses as salmonellosis and trichinosis.

Mining, and Oil and Gas Exploration. The search for valuable natural resources has been radically altered in the last generation by the introduction of radiation-based exploratory techniques. Nuclear technology is also important to resource recovery and transportation in these industries. Lateral drilling, for example, relies on radiation-based well-logging measurements to tap into small oil deposits, and construction and pipeline crews routinely use radiographic testing to check the durability of welds and nuclear gauges to measure the density of road surfaces.

Consumer Products. Virtually every American home contains several consumer products using nuclear technology, from nonstick pans treated with radiation to prolong the life span of their surfaces, to photocopiers and computer disks that use small amounts of radiation to eliminate static, to cosmetics, bandages, contact-lens solutions, and hygiene products sterilized with radiation to remove allergens.

Careers and Course Work

Given the breadth of the applications of nuclear technology, career options lie in almost every field, but almost all require a college degree with specific technical training, especially in fields such as nuclear chemistry, nuclear physics, nuclear engineering, and nuclear medicine. For the more specialized areas, a career in nuclear technology research or in development and application tends to require one or more advanced degrees, and given the complex, cutting-edge nature of such work, graduate training can be an advantage even for technician positions. Jobs nonetheless remain plentiful in most sectors for those who have the necessary training, both in government (maintaining and operating the Navy's nuclear fleet, for example) and in the private sector. The power industry routinely advertises its need for design engineers, process control engineers, technical consultants, civil, mechanical, and electrical nuclear engineers, and nuclear work planners; the medical community constantly seeks radiologists and other personnel trained in nuclear technology; and nuclear technicians remain in high demand in consumer product manufacturing, mining, and agriculture.

Public administration careers should not be ignored. National and local government entities such as the U.S. Department of Energy, the Nuclear Regulatory Commission, oversight agencies at the state level, and their counterparts in other countries are also career outlets for those combining business administration or public administration training with knowledge of nuclear technologies.

Social Context and Future Prospects

The application of nuclear technology is progressing on three pathways, the first of which is of serious global concern.

Students of international affairs have long been concerned with the problem of “runaway” nuclear proliferation: the acquisition of nuclear weapons by so many states that others feel compelled to acquire them as well, multiplying the number of nuclear-armed states in a short time and making an accidental or intentional nuclear war more likely. The presence of stateless terrorist organizations willing to engage in attacks involving mass casualties has significantly elevated this concern. Until the 2000s, the pace of proliferation was incremental, and those who acquired the weaponry were sometimes reluctant to publicize the fact. The acquisition of nuclear weapons by Pakistan and North Korea, and the suspected pursuit of them by Iran, has heightened international concern.

The second track holds considerably more potential for good: a renewed interest in nuclear power to meet the world's growing electrical needs. As a source of electrification, nuclear power fell largely out of fashion during the late twentieth century in much of the world, a result of the cost of building and maintaining nuclear power plants compared with the cheap cost of imported energy between 1984 and 2003, of the anti-nuclear movement and the public's reluctance to see nuclear power plants built in their backyards, and of the appeal of environmentally friendly, renewable green energy sources during the era of rising oil prices that followed the US-led invasion of Iraq in 2003.

The prospect of employment in nuclear power remains good for two reasons. First, research and development has produced techniques that prolong the life span of existing nuclear power plants well beyond their originally intended service life, and trained personnel are needed at all levels to continue that research and to operate those plants safely. Second, modern green technologies are unlikely to be able to power the giant electrical grids increasingly demanded by megacities around the world, especially in developing countries, or to meet the global demand for ever more electrical power, which is increasing at about 1 percent per year in the United States and at a far higher rate in developing areas. Large-scale nuclear power plants can meet those needs while adhering to the environmental standards to which northern and southern hemisphere states have committed themselves.

Finally, there is the broad umbrella area of civilian societal applications, where a continuing high demand for nuclear technology in the field of medicine, virtually every area of industry, agriculture, and consumer products can be predicted with far greater assurance than the future demand for nuclear power as a source of electrification. In fact, so assured is the presumption of a steadily growing demand for nuclear-based products in medicine alone that it is driving much of the interest in constructing new reactor facilities just to produce the materials used in radiation-based therapies.

Bibliography

Angelo, Joseph A., Jr. Nuclear Technology. Greenwood, 2004.

Hamblin, Jacob D. The Wretched Atom: America’s Global Gamble with Peaceful Nuclear Technology. Oxford University Press, 2021.

Morris, Robert C. The Environmental Case for Nuclear Power: Economic, Medical and Political Considerations. Paragon House, 2000.

"Plans For New Reactors Worldwide." World Nuclear Association, 30 Apr. 2024, www.world-nuclear.org/information-library/current-and-future-generation/plans-for-new-reactors-worldwide.aspx. Accessed 11 June 2024.

Shackett, Peter. Nuclear Medicine Technology: Procedures and Quick Reference. 3rd ed., Lippincott, 2020.

"The Many Uses of Nuclear Technology." World Nuclear Association, 30 Apr. 2024, www.world-nuclear.org/information-library/non-power-nuclear-applications/overview/the-many-uses-of-nuclear-technology. Accessed 11 June 2024.

United States Congress, House Committee on Foreign Affairs, Subcommittee on Terrorism, Nonproliferation, and Trade. Isolating Proliferators, and Sponsors of Terror: The Use of Sanctions and the International Financial System to Change Regime Behavior. Government Printing Office, 2007.

Yang, Chi-Jen. Belief-Based Energy Technology Development in the United States: A Comparative Study of Nuclear Power and Synthetic Fuel Policies. Cambria, 2009.