Y2K "Crisis"

Date: January 1, 2000

Despite predictions that businesses, governments, and public services would suffer disastrous malfunctions when computers' internal calendars reached January 1, 2000, the worldwide transition to the year 2000 caused few problems, thanks to extensive preparations.

Also known as: Millennium bug

Locale: Worldwide

Key Figures

  • Grace Murray Hopper (1906-1992), inventor of the English-based computer language FLOW-MATIC, which evolved into COBOL
  • Robert Bemer (1920-2004), codeveloper of COBOL who was the first to publish warnings about the date problem hidden in most computer software
  • Peter de Jager (b. 1955), Canadian computer consultant who was an early, influential advocate of preparing for the Y2K transition
  • Daniel Patrick Moynihan (1927-2003), chair of the U.S. Senate committee that persuaded the federal government to prepare for Y2K
  • John A. Koskinen (b. 1939), chair of the President’s Council on Year 2000 Conversion

Summary of Event

Across North America and around the world, people waited nervously as midnight approached on New Year’s Eve, December 31, 1999. Many wondered whether predictions of doom about the year 2000 computer transition, popularly called the Y2K (for “year 2000,” with k representing the Greek kilo for “thousand”) problem or the millennium bug, would prove correct: Would power and water supplies fail, food distribution be disrupted, the economy begin to disintegrate, nuclear missiles launch accidentally, and widespread civil disturbances begin as computers and computer networks failed everywhere? No one was completely sure how to answer these questions, even though massive efforts to avert any possible problems occupied governments and businesses throughout the late 1990’s.


A definitive answer was apparent within days after January 1, 2000, came and went: There were no disasters. Some computer problems did occur on New Year’s Day and afterward, but they were so few, so inconsequential, and so easily corrected that even the most optimistic experts were surprised.

The story of the Y2K transition problem began with the development of commercial computing. In 1957, Rear Admiral Grace Murray Hopper invented a programming language called FLOW-MATIC, the first to be based on English in order to make computers easier for businesses to use. FLOW-MATIC formed the basis for COBOL, the name of which derived from “common business-oriented language.” The principal data storage device of the times was the eighty-column punch card. To conserve space, COBOL used only six digits to represent any given calendar date—two each for the month, the day, and the year, as in “04/15/53” for April 15, 1953. This shortcut dating method saved as much as twenty dollars in the production of a date-sensitive record, so it was an important way of economizing as businesses grew dependent on computers.
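The economics of the shortcut, and its hidden flaw, can be seen in a brief sketch. The snippet below is illustrative modern Python, not period COBOL: it produces the six-character date field described above and shows how two-digit years misorder dates once the century turns.

    from datetime import date

    # Illustrative modern Python, not period COBOL: the six-character date field
    # (two digits each for month, day, and year) and the ordering problem it hides.
    def encode_mmddyy(d: date) -> str:
        return d.strftime("%m/%d/%y")

    print(encode_mmddyy(date(1953, 4, 15)))   # 04/15/53, as in the example above

    # Comparing only the stored two-digit years makes 2000 appear earlier than 1999.
    print("00" < "99")        # True: the year 2000 sorts before 1999
    print("2000" < "1999")    # False: four-digit years sort correctly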

Computer scientists, led by Robert Bemer, one of COBOL’s developers, warned that using only two digits for each year designation would later cause problems and argued for a four-digit style. However, the desire of businesses to minimize their immediate expenses overwhelmed such objections. When International Business Machines (IBM) designed its System/360 mainframe computer (marketed in 1964), it incorporated the COBOL two-digit year format. That computer, and its dating style, became the industry standard. Bemer again published warnings about the dating problem in 1971 and 1979, but his protests stirred little interest and no change. To most businesses and government agencies the heart of the danger—the arrival of the year 2000—seemed too far away to worry about at the time.

In 1993, Peter de Jager, a Canadian computer consultant, published an article with the alarming title "Doomsday 2000" in Computerworld, a magazine aimed at technology managers. In that article and in subsequent lectures, de Jager argued that the Y2K bug could initiate massive disruptions and plunge the economy into a recession. Computers, he pointed out, would read a date such as "01/01/00" as "January 1, 1900," because their software made no provision for years beyond 1999, and computer-processed date-sensitive information was fundamental to national infrastructures. There were already signs that he was right: That same year, a U.S. missile warning system malfunctioned when its computer clocks were experimentally turned forward to 01/01/00.
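The kind of misreading de Jager described can be sketched in a few hypothetical lines (modern Python for illustration; the century assumption, not any particular system's code, is the point):

    # Hypothetical sketch: a program that assumes every two-digit year is in the 1900s.
    def naive_year(two_digit: str) -> int:
        return 1900 + int(two_digit)          # "00" becomes 1900, not 2000

    issued, checked = "99", "00"              # e.g., a record from 1999 processed in 2000
    print(naive_year(checked))                # 1900
    print(naive_year(checked) - naive_year(issued))   # -99: elapsed-time arithmetic goes wrong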

During the next seven years, other glitches turned up sporadically during testing. At the same time, with gathering momentum, attempts were under way to remedy the date problem. In 1996, Senator Daniel Patrick Moynihan of New York held committee hearings on the Y2K bug and directed the Congressional Research Service to study the potential problem. The report produced as a result helped to convince President Bill Clinton to establish the President’s Council on Year 2000 Conversion, directed by John A. Koskinen, in 1998. Koskinen oversaw programs to adjust the software used by government agencies. The U.S. government also ordered many organizations essential to the economy, such as stock brokerages, to fix the problem—that is, to “become Y2K compliant”—by August 31, 1999.

Despite initial skepticism about the true seriousness of the Y2K problem, big companies soon undertook remediation efforts of their own. Most employed one or more of three basic methods, termed "windowing," "time shifting," and "encapsulation." Windowing, the most common, entailed teaching computers to read 00 as 2000 and to place other two-digit year dates in their appropriate century. Time shifting involved programming computers to recalculate dates automatically following a formula. Encapsulation, a refinement of time shifting, added 28 to two-digit years to synchronize computers with the cycles of days of the week and of leap years: because the calendar's pattern of weekdays and leap years repeats every twenty-eight years, dates shifted by exactly that amount could be processed consistently and then shifted back. All three techniques required exhaustive searches and reprogramming of mainframes and personal computers that processed time-sensitive information, such as pay schedules and product expiration dates.
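A minimal sketch of the windowing idea follows, assuming an illustrative pivot of 30 (real systems chose their own pivots); the final line checks the twenty-eight-year calendar cycle that encapsulation relied on.

    from datetime import date

    # Windowing with an assumed pivot of 30: two-digit years below the pivot are
    # read as 20xx, the rest as 19xx.
    PIVOT = 30

    def window_year(two_digit: int) -> int:
        return 2000 + two_digit if two_digit < PIVOT else 1900 + two_digit

    print(window_year(0), window_year(53), window_year(99))   # 2000 1953 1999

    # The 28-year offset works because the calendar's pattern of weekdays and leap
    # years repeats every 28 years between 1901 and 2099.
    print(date(1972, 2, 29).weekday() == date(2000, 2, 29).weekday())   # True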

Computer chips embedded in various kinds of equipment posed further difficulties. Since their introduction in the early 1970’s, microprocessors had been built into appliances, tools, automobiles, and machinery of all kinds: By the late 1990’s, they controlled the operations of nuclear power plants, utilities, hospital technology, weaponry, and climate control systems in buildings, in addition to such mundane devices as home microwave ovens. With between thirty-two billion and forty billion chips in use by 2000, their potential for causing trouble was enormous even if only a fraction of them controlled time-sensitive operations, and often the chips were difficult to extract and replace.

As the year 2000 approached, the frenzy of preparation increased, and predictions of disaster grew more ominous. Some consumers stockpiled generators, money, food, and fuel in case utility and supply systems became disrupted on January 1, 2000. Some government agencies failed to meet their August 31 deadline for Y2K compliance. Large corporations worried that their preparations were insufficient, and about a third of small American businesses made no preparations whatsoever.

When the moment of truth came and passed on New Year's Day of 2000, no major system failures occurred, and essential services were uninterrupted even in countries, such as Russia, that relied on sophisticated computer technology yet were largely unprepared for the date turnover. There were problems, however. Some were comical, as when a 105-year-old man was directed to attend kindergarten, some newborn children were registered as born in 1900, and the Web site of the U.S. Naval Observatory, the government's official timekeeper, proclaimed the date as "January 1, 19100."
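The "19100" display is consistent with a widespread pattern in which software stored the year as a count of years since 1900 and display code prefixed the literal digits "19"; the sketch below illustrates that pattern in general terms, not the Naval Observatory's actual code.

    # Hypothetical sketch of how "January 1, 19100" can arise: the year is stored
    # as years since 1900, and the display code simply prefixes "19".
    def display_year(years_since_1900: int) -> str:
        return "19" + str(years_since_1900)   # fine for 0-99, wrong from 2000 onward

    print(display_year(99))    # 1999
    print(display_year(100))   # 19100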

Most problems were simply annoyances. Some records were accidentally deleted, software used to service credit cards double-charged some users, renters returning videos that were one day overdue were billed for thousands of dollars in late charges, and cell phone messages were lost. Most such problems were easily corrected. Other problems were potentially more serious. For example, one Wall Street computer inflated a few stock values, and a small number of company security systems failed. Some satellites, including one U.S. spy satellite, lost contact with their controllers. Software modifications and simple common sense were sufficient to rectify the errors.

The Y2K problem did not end with the New Year's date turnover, however. One expert calculated that only about 10 percent of the problems would turn up immediately; the rest would surface later. The leap year day of February 29, 2000, for example, caused at least 250 glitches in seventy-five countries, although none was major.
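Many of the reported leap-day glitches are generally attributed to an incomplete leap-year rule: 2000 was a leap year only because years divisible by 400 are leap years, an exception some programs omitted. The sketch below contrasts the full Gregorian rule with that shortcut.

    # The full Gregorian leap-year rule versus the common shortcut that omits the
    # divisible-by-400 exception and therefore misses February 29, 2000.
    def is_leap(year: int) -> bool:
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    def is_leap_shortcut(year: int) -> bool:
        return year % 4 == 0 and year % 100 != 0

    print(is_leap(2000), is_leap_shortcut(2000))   # True False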

Significance

Even though the year 2000 turnover passed without disaster, the event itself and the preparations for it revealed how thoroughly modern society had come to rely on a sophisticated technological infrastructure. Controlling and coordinating that infrastructure are computers and, increasingly since about 1990, computer networks, especially the Internet. The Y2K threat to information technology (IT) elicited one of the largest and most effective joint responses among businesses and government agencies in U.S. history as well as extensive international cooperation. Programmers successfully corrected well over 95 percent of Y2K-related problems. People around the world, particularly Americans, became more keenly aware of their dependence on computers, but they also learned that the technology was not beyond their control.

Because of its very success, the remediation effort had its critics, some of them bitterly vocal. In part, critics wondered how so little could go wrong if the Y2K bug had really been as big a threat as IT experts had insisted. Editorials and letters to the editors of business periodicals accused the large coterie of Y2K experts of exaggerating the danger in order to scare businesses into spending money unnecessarily on remediation. They denounced the media hoopla and claimed that the predictions of doom had been psychologically harmful.

Critics were also outraged by the price of remediation. In 1993, de Jager estimated that addressing the problem would cost between $50 billion and $75 billion worldwide. He was far too conservative. The United States alone spent $100 billion, including $8.5 billion by the federal government, according to the U.S. Department of Commerce. The worldwide bill was estimated at between $500 billion and $600 billion. De Jager and his colleagues admitted that costs may have been unnecessarily high, but they insisted that the money was well spent, because without remediation wide-scale system malfunctions would have occurred, costing much more money to repair and causing civil disorder. The controversy created a measure of ill will between businesses and IT specialists.

In addition to avoiding disaster, Y2K remediation had immediate tangible benefits for some segments of society. The rush to stockpile food and equipment before the New Year brought record profits to some manufacturers and retailers. Computer programmers were in high demand, and consultants earned money with books, articles, lectures, and Web sites offering advice. Companies were launched specifically to solve Y2K problems for businesses; many of them afterward diversified to serve the general needs of electronic commerce. The close scrutiny that programmers gave to existing software reduced companies' overhead expenses as well. Programmers removed the clutter of computer code that had accumulated during decades of reprogramming and computer upgrades and uncovered applications that could be eliminated, streamlining business computer systems. Many companies learned how to conduct contingency planning for IT malfunctions. Others, especially small businesses, learned how to use computers effectively for the first time.

Less tangible, but at least as important, were two general lessons for businesses and governments. First, they were forced to reevaluate their dependence on technology, to understand the complexity of that technology, and to be aware of the danger to the technology from unforeseen conditions, such as the Y2K date problem. Second, they learned dramatically that forty years of development and use had built a computer infrastructure with serious inconsistencies and imperfections. Accordingly, commentators suggested that IT specialists, especially those developing large projects, should undergo certification to ensure coherent planning.

The President’s Council on Year 2000 Conversion was demobilized after February 29, 2000, but the Y2K bug continued to have direct and indirect effects on business. Many organizations had deferred computer data entry and innovations in order to devote employee time to Y2K remediation, so following the New Year, they had to clear up the work backlog. Moreover, according to de Jager and other analysts, the programming techniques used to remedy Y2K dating problems were stopgaps, often not coordinated between computer systems and potentially only temporarily effective. Windowing and time shifting could insinuate subtle changes into computer codes, changes that might not cause problems for decades.

Bibliography

De Jager, Peter. “Y2K: So Many Bugs . . . So Little Time.” Scientific American (January, 1999): 88-93. Presents a thorough technical explanation of the computer problems, geared toward business computing and record keeping. Stands as an example of a forecast about Y2K that was far too pessimistic.

JD Consulting. Y2K Procrastinator’s Guide. Rockland, Mass.: Charles River Media, 2000. Introduction lucidly explains the source and nature of the date problem in business computers and embedded computer chips.

Kuo, L. Jay, and Edward M. Dua. Crisis Investing for the Year 2000: How to Profit from the Coming Y2K Computer Crash. Secaucus, N.J.: Birch Lane Press, 1999. Offers a balanced summary of the computer problem for businesspeople and then discusses potential economic developments in detail.

McGuigan, Dermot, and Beverly Jacobson. Y2K and Y-O-U: A Sane Person’s Home-Preparation Guide. White River Junction, Vt.: Chelsea Green, 1999. An example, sensible and practical, of the better-safe-than-sorry advice offered to people worried about the millennium transition.

Yourdon, Edward, and Jennifer Yourdon. Time Bomb 2000. 2d ed. Upper Saddle River, N.J.: Prentice Hall, 1999. An example of a gloomy assessment of Y2K risks to most segments of society.