First Factoring of a 100-Digit Number

On October 11, 1988, a 100-digit number was successfully factored for the first time. In mathematics, factoring is the process of determining how a large number may be divided by smaller numbers without leaving any remainder. For example, the number 20 has the factors 1, 2, 4, 5, 10, and 20 itself, because all of those numbers divide cleanly into 20. Nonfactors do not divide cleanly: the number 3 is not a factor of 20 because it goes into 20 six times (6 times 3 equals 18) but leaves a remainder of 2 (20 minus 18 equals 2). Factoring a number, that is, determining all of its factors as was done for 20 above, can be enormously difficult for large numbers, but it has practical applications beyond the realm of pure mathematics in the field of code-breaking. Modern cryptography is generally based on extremely complex numerical calculations, and factoring can help break such codes or help design more secure ones.
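
The idea can be made concrete with a short program. The following Python sketch finds factors by trial division, exactly as described above: every candidate divisor that leaves no remainder is a factor. The function and variable names are illustrative, not part of the original account.

```python
def factors(n: int) -> list[int]:
    """Return every positive integer that divides n without remainder."""
    result = []
    for candidate in range(1, n + 1):
        if n % candidate == 0:  # no remainder, so candidate is a factor
            result.append(candidate)
    return result

print(factors(20))  # [1, 2, 4, 5, 10, 20]
print(20 % 3)       # 2 -- the remainder that makes 3 a nonfactor of 20
```

Trial division like this is fine for a number as small as 20, but the amount of work grows so quickly with the size of the number that factoring a 100-digit number requires far more sophisticated methods and enormous computing power.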

Using hundreds of computers networked across the United States, Western Europe, and other parts of the world, scientists and mathematicians led by teams from the University of Chicago and the Digital Equipment Corporation were able to factor a 100-digit number for the first time. The 100-digit number they factored was drawn from 11 raised to the 104th power, plus 1.
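
As a quick check, Python's built-in arbitrary-precision integers make it easy to examine the number behind the computation. This is only a sketch for the reader; it is not part of the original effort. Note that 11 to the 104th power, plus 1, itself has 109 digits, which is why the record-setting 100-digit number was a factor derived from it rather than the full value.

```python
# Compute 11^104 + 1 exactly and count its decimal digits.
n = 11**104 + 1
print(len(str(n)))  # 109
```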