Quantum Computation
Quantum computation is a cutting-edge area of study that combines principles from computer science and quantum mechanics to revolutionize the way data is processed. Unlike traditional computers, which use bits that are either 0 or 1, quantum computers utilize qubits—quantum bits that can exist in superpositions of states. This means that a qubit can represent multiple values simultaneously, allowing quantum computers to perform certain calculations much faster than classical computers. The power of quantum computation lies in its ability to exploit quantum phenomena like superposition and entanglement, enabling it to solve complex problems such as factoring large numbers and searching databases more efficiently.
The implementation of quantum computation faces significant challenges, notably the decoherence problem, where qubits lose their quantum state due to external interactions. Researchers are exploring various physical systems to create stable qubits, including ion traps and superconductors. The potential applications of quantum computing are vast, ranging from secure communication through quantum cryptography to breakthroughs in drug discovery and optimization problems. While practical, large-scale quantum computers are still in development, the ongoing research holds promise for transforming technology and computation in the future.
Subject Terms
Quantum Computation
Type of physical science: Quantum Computation, Cryptography, Computers, Quantum mechanics, Atomic physics
Field of study: Nonrelativistic quantum mechanics
A quantum computer is a device that makes use of quantum-mechanical phenomena to process data. As a consequence of the fact that atomic systems are not in a definite state until measured, quantum computers can perform certain functions in fewer operations than would be needed to perform the computation on a classical digital computer.
![Qubits are made up of controlled particles and the means of control (e.g., devices that trap particles and switch them from one state to another). There are four established qubit candidates: ion traps, quantum dots, semiconductor impurities, and superconductors. Jbw2 at en.wikipedia (GFDL/CC BY-SA 3.0), via Wikimedia Commons](https://imageserver.ebscohost.com/img/embimages/ers/sp/embedded/89317172-89562.jpg?ephost1=dGJyMNHX8kSepq84xNvgOLCmsE2epq5Srqa4SK6WxWXS)
![Photograph of a chip constructed by D-Wave Systems Inc., designed to operate as a 128-qubit superconducting adiabatic quantum optimization processor, mounted in a sample holder. By D-Wave Systems, Inc. (CC BY 3.0), via Wikimedia Commons](https://imageserver.ebscohost.com/img/embimages/ers/sp/embedded/89317172-89563.jpg?ephost1=dGJyMNHX8kSepq84xNvgOLCmsE2epq5Srqa4SK6WxWXS)
Overview
Quantum computation is an interdisciplinary study that merges the fields of computer science, experimental physics, and theoretical physics. A computer is a device that takes an input problem specified by some number of bits, which in digital computers correspond to numerical values of either 0 or 1; applies a series of logic gates to that input to process it; and then outputs a result. Classical digital computers are perfectly predictable and deterministic. If the computer is working correctly, a given input should always produce the same output. At any given stage in the calculation, the computer is in some definite state.
Alan Turing was the first scientist to think about the question of what classical computers could do. He showed that an idealized computer with an infinite memory tape, later called a Turing machine, could find the answers to all solvable mathematical problems in some finite amount of time using some finite number of states. However, that finite amount of time might be very long, and the finite number of states very large.
The first results in quantum computation, produced by David Deutsch, showed that a quantum computer could not solve any problems that could not be solved by a classical Turing machine. However, there are certain problems that would be intrinsically faster on a quantum computer. In other words, the total number of steps needed to solve the problem, the computational complexity, would be smaller than the number of steps required for the classical computer.
Classical computers may be built of any material, from transistors to Tinkertoys, as long as that material is capable of staying in a definite state. When very small objects on the scale of atoms are considered, this requirement no longer holds. Atoms are governed by a kind of physics called quantum mechanics, under which atoms, when they are not being observed, exist not as particles occupying a discrete place but as waves that can interfere with one another. The behavior of a set of atoms, while governed by rules, lacks the perfect determinism required by the above definition of a computer, because atoms are not perfectly predictable. At any given stage in the calculation, the computer may be in a mixture of states.
Instead of typical digital bits, information is entered into a quantum computer in the form of qubits, or quantum bits. These qubits can be represented by any two-state quantum system, meaning a system that can exist in a superposition of two different quantum states. They might be photons, with the two states being horizontal and vertical polarization; or hydrogen atoms, with the two states represented by whether the atom's electron is in a ground state or an excited state. In the latter example, if the atom is in an excited state, having just absorbed a photon, that state could be called 1; if it is in the ground state, having emitted a photon, that state could be encoded as 0. In order to write to the system, one would bathe the atom in laser light made up of photons having energy equal to the difference between the energy of the ground state and the energy of the excited state. If the atom is in the excited state, then it will go to the ground state, and if it is in the ground state, then it will go into the excited state. Unlike in a classical system, however, it is possible to apply the laser light for half the time it would take to put the atom into the other pure state, leaving the qubit in a superposition between the two classical states. A machine that has inputs or intermediate states that are superpositions of digital values is the definition of a quantum computer.
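The half-pulse idea can be sketched with a small state-vector calculation. This is a toy model: a real-valued rotation stands in for the actual light-atom interaction, and the function name `pulse` is illustrative, not a standard API.

```python
import numpy as np

# A qubit is a unit vector in a two-dimensional space.
ground = np.array([1.0, 0.0])   # |0>: atom in its ground state
excited = np.array([0.0, 1.0])  # |1>: atom in its excited state

def pulse(state, theta):
    """Rotate the qubit; theta = pi swaps |0> and |1> (a full pulse),
    while theta = pi/2 (a half pulse) yields an equal superposition."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]]) @ state

full = pulse(ground, np.pi)        # measurement probabilities [0, 1]
half = pulse(ground, np.pi / 2)    # measurement probabilities [0.5, 0.5]
print(np.round(full ** 2, 3), np.round(half ** 2, 3))
```

Squaring the amplitudes gives the probability of observing each classical state: the full pulse deterministically flips the qubit, while the half pulse leaves a fifty-fifty superposition.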
What is superposition? Imagine a waiter carrying a cup of hot coffee across a room. The rhythm of his walk imparts a certain oscillation to the coffee, which grows as he walks. If he also sways the cup gently at any frequency other than his walking frequency, the oscillations subside, because the coffee ends up in a state that is the superposition of the two different wave states, and the waves interfere destructively.
Quantum superpositions differ from classical ones in that they are never observed. When an attempt is made to observe a quantum superposition, generally one of the waves that compose it is produced at random, as if a die had been thrown. An ideal "quantum cup of coffee" will spill if the waiter happens to look at it while he is carrying it, even if he is careful. He can, however, carry it without spilling indefinitely as long as no one looks at it.
Real-world systems, which for the most part have been limited to only a handful of qubits, have implemented those qubits via such systems as liquid-state nuclear magnetic resonance (NMR), in which molecules are dissolved in a solution and the qubits are the molecules' nuclear spin states; solid-state NMR, in which the qubits are the nuclear spin states of phosphorus atoms held in silicon; ion traps, in which the qubits are either the ions' hyperfine ground states (hyperfine qubits) or the ground state and the excited state (optical qubits); and Josephson junctions, in which the qubits are the states of small circuits composed of two weakly linked superconductors, among others.
Perhaps the simplest computation imaginable is adding one and one to make two. In binary, 1 + 1 = 10, which can be implemented by two bits and two gates: an XOR gate, which produces the sum bit, and an AND gate, which produces the carry bit, outputting a 1 only if both inputs are 1. The two bits are routed into the gates by a circuit. It can be shown with truth tables that a small set of gates suffices to build any other; in the quantum case, the controlled-NOT (CNOT) gate combined with single-qubit gates, such as the NOT, or inverter, gate and rotations, can transform any two-bit value or superposition of bit values into any other. Such gates form a universal set of logic gates, adequate to do any computation that is Turing computable if enough gates and bits are available.
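The classical truth tables involved can be checked directly; the XOR gate, which supplies the sum bit when adding one and one, is included here as an assumption to make the addition example complete.

```python
# Half adder: the XOR gate gives the sum bit, the AND gate the carry bit,
# so 1 + 1 yields carry 1, sum 0, that is, binary 10 (decimal 2).
def half_adder(a, b):
    return (a & b, a ^ b)  # (carry, sum)

# The controlled-NOT (CNOT) gate flips the target bit exactly when the
# control bit is 1; it is reversible, so applying it twice undoes it.
def cnot(control, target):
    return (control, target ^ control)

print(half_adder(1, 1))                     # (1, 0): one plus one is binary 10
for c in (0, 1):
    for t in (0, 1):
        assert cnot(*cnot(c, t)) == (c, t)  # reversibility check
```

Reversibility matters for quantum gates: every quantum logic gate must be undoable, which is why the reversible CNOT, rather than the information-destroying AND, appears in quantum gate sets.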
What does implementation of a quantum logic gate mean? There are no quantum circuits; no electrical pulse is moving from place to place. Rather, pulses of electromagnetic energy of very precise frequency are applied one after another to pairs of qubits in order to leave them in the output state desired. This process is straightforward if there are only two qubits in the system; the difficult question is, how can any two qubits be independently addressed out of a collection of several hundred qubits?
The above problem is called the interconnection problem. Two qubits must be coupled when one gate is applied but not when some other gate is applied, even though each qubit may also need to be coupled to other qubits at other points in the computation. One possible solution to this problem, demonstrated in 2011 by a research team at Harvard University, is to use an atomic-force microscope (AFM) with a tiny but powerful magnet at the scanning tip to manipulate individual electron spins, in a system in which the electron spin state constitutes the qubit.
What is the value of having qubits in the superposed states? How does this reduce the number of steps needed for certain calculations? Each qubit in a superposition is, in some sense, trying out both possible values in each subsequent operation on the qubit. The value that the qubit ultimately takes when the state of the system is measured and the superpositions collapse is entangled with the values of all other qubits that have previously interacted with it. The final output value depends not only on a single sequence of calculations (the number of steps actually taken, as in classical computation) but also on all possible steps. The output of a quantum computer's qubits is a state selected at random from a set of possible answers, with the proviso that there is a high probability of selecting the correct answer from all the answers generated by possible sequences of calculation.
This account assumes that there are certain features of the correct path of computation that are shared with the incorrect paths. These features correspond to wave properties of the qubit states that produce constructive interference, making the correct path stand out as a signal over the background noise of the incorrect paths, which are canceled out by destructive interference. Such features have only been found in the relationship of correct paths to incorrect paths of certain problems devised especially for quantum computers, such as a special way of factoring numbers. In order to be sure of a quantum computation, even in a perfectly well-functioning machine, the computer must be run several times and a majority vote must be taken on the outcome, as was done in the early days of digital computing.
The main challenge to effective quantum computation is called the decoherence problem. Any interaction with a qubit at all will cause it to take a certain definite state. If a qubit takes a definite state, then the superposed coherence of the calculation is lost. Some theorists have deduced that the likelihood of such an event is so great that if nothing were done to correct for decoherence, the largest quantum computer that could be built would have twenty qubits and be capable of doing about twenty thousand gate operations, which would permit no task greater than factoring a four-bit number. The various schemes that are used in classical computers to correct for errors do not work with quantum computers, because these schemes presume that it is possible to find out what the state of the machine is without disturbing a computation.
Applications
One potential application of quantum computers is the sending and breaking of secret messages. A popular form of sending information is public-key cryptography, in which the recipient of the message discloses a public key to anyone who wants to send a message. This key is mathematically related to a private key, which the recipient keeps. Anyone may use the public key to encode a message, but the recipient is the only one who can decode it, unless a codebreaker is able to factor the large number on which the public key is based. On classical computers, this is a hard task. In 1994, it took a network of 1,600 machines eight months to factor a 129-digit number; factoring a 250-digit number, the key size used for important financial and military transactions, would have taken centuries. There seemed to be no way to attack the problem other than by brute force.
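The relationship between factoring and codebreaking can be seen in a toy public-key example. The primes and message below are illustrative; real keys use moduli hundreds of digits long, which is exactly what makes the factoring attack classically infeasible.

```python
# Toy RSA-style sketch with deliberately tiny primes.
p, q = 53, 61
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, chosen coprime to phi
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

message = 65
cipher = pow(message, e, n)          # anyone can encrypt with (e, n)
assert pow(cipher, d, n) == message  # only the holder of d can decrypt

# A codebreaker who factors n can reconstruct the private key outright:
factor = next(k for k in range(2, n) if n % k == 0)  # trial division finds 53
d_stolen = pow(e, -1, (factor - 1) * (n // factor - 1))
print(pow(cipher, d_stolen, n))  # 65: the message, recovered by factoring
```

Trial division breaks this four-digit modulus instantly, but its cost grows exponentially with the number of digits, which is why a fast quantum factoring method threatens the whole scheme.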
The obvious way to apply quantum parallelism to factoring some large number N would be to try, in parallel, to divide it by all numbers less than the square root of N. A quantum computer could be programmed to do this, and some of the superposition paths that evolved from the application of gates would indicate solutions to the problem. When the system as a whole was measured to determine the solution, however, it would be very unlikely that this weak signal would emerge from the huge amount of noise generated by the other possible paths of the system's evolution.
Peter Shor, a computer scientist, looked at the problem of finding factors in a new way. Previous work in number theory had shown that if the period of a certain function, which is the number of terms in the sequence until the sequence repeats itself, is known, then factoring is straightforward and can be carried out efficiently on a classical computer. In 1994, Shor developed a quantum algorithm for factoring integers, known as Shor's algorithm, which evaluates the function on a superposition of exponentially many inputs, computes a quantum Fourier transform on the superposition, and then samples the qubits to read off the function's period, in exponentially less time than the best known classical methods require.
The quantum computer has two quantum registers, X and Y, each consisting of a number of qubits initially set to 0. Each individual qubit in the X register is then put into a superposed state. The function, represented by a series of logic gates, is applied with X as input, and the output is stored in Y. The quantum Fourier transform, again a series of quantum logic gates, is then applied to the X register; it serves to amplify the signal of the correct solution. Finally, a measurement is performed on the X register that will, more often than not, yield the period.
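The reduction from factoring to period finding can be traced classically. In the sketch below the period is found by brute force; that search is the sole step the quantum registers and Fourier transform accelerate, while the surrounding number theory is the same pre- and post-processing Shor's algorithm uses.

```python
from math import gcd

def period(a, n):
    """Brute-force period of f(x) = a**x mod n: the smallest r with a**r = 1 mod n.
    This exponential-time search is what the quantum Fourier transform replaces."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_period(n, a):
    """Given the period r of a**x mod n, extract factors of n via gcd."""
    assert gcd(a, n) == 1
    r = period(a, n)
    if r % 2 == 1:
        return None            # unlucky choice of a; try another
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None            # another unlucky case; try another a
    return gcd(y - 1, n), gcd(y + 1, n)

print(factor_from_period(15, 7))  # (3, 5): the period of 7**x mod 15 is 4
```

For n = 15 and a = 7, the powers of 7 mod 15 cycle through 7, 4, 13, 1, so the period is 4; squaring 7 twice gives y = 4, and the gcds of y ± 1 with 15 yield the factors 3 and 5.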
Numerous other quantum algorithms exist to perform various tasks. Another well-known example is Grover's algorithm, developed in 1996 by Lov Grover, which is used for searching an unsorted database. Grover's algorithm is only quadratically faster than its classical counterpart, rather than exponentially faster, but still represents a considerable increase in speed, especially when the number of entries in the database is large.
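Grover's iterations can be followed in a small state-vector simulation. This is a sketch, not a gate-level implementation, and the database size of 8 and target index 5 are illustrative choices.

```python
import numpy as np

# Grover search over 8 database entries, with the marked entry at index 5.
# Each iteration phase-flips the marked amplitude, then reflects all
# amplitudes about their mean; roughly (pi/4) * sqrt(N) iterations
# concentrate the probability on the target.
n_items, target = 8, 5
state = np.full(n_items, 1 / np.sqrt(n_items))  # uniform superposition

for _ in range(2):                     # ~ (pi/4) * sqrt(8) iterations
    state[target] *= -1                # oracle: mark the target
    state = 2 * state.mean() - state   # diffusion: inversion about the mean

probabilities = state ** 2
print(int(probabilities.argmax()), round(float(probabilities[target]), 3))
# index 5 now carries about 94.5% of the probability
```

After only two iterations, a measurement returns the marked index with high probability, whereas a classical search of an unsorted list of 8 entries needs on average 4 lookups; the quadratic gap widens as the database grows.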
Other potential applications of quantum computers include any task that involves sorting through large amounts of data or performing massive numbers of calculations, such as validating software, modeling chemical compounds to create new drugs, forecasting weather patterns, or analyzing astronomical data, among many others.
Context
Ever since the invention of the digital computer in the late 1940s, there has been steady progress in the miniaturization of computer parts. Because signals inside a computer soon traveled at close to their upper speed limit, the speed of light, the only way to make computers work faster was to place parts closer together, reducing the distance information had to travel inside the computer. Scientists realized in the 1960s that if the rate of miniaturization continued, an atomic limit would be reached quite soon. At that size, certain strange things would happen due to quantum-mechanical uncertainty, such as the information in one bit randomly affecting the information in a nearby bit. Early work in quantum computation was merely an attempt to show theoretically that computers built of components subject to such quantum-mechanical uncertainties could be operated in such a way that any errors introduced could be corrected.
During the 1970s, certain engineering limits were reached in the design of classical computers. Putting too many components too close together or running the processor too fast generated so much heat that errors would result. The solution was to break the problems the computers were trying to solve into separate logical parts, then turn each of them over to a subprocessor that would communicate with the other processors about the progress of the computation. These machines were called supercomputers or parallel processors.
The quantum-computational revolution of the mid-1980s, led by David Deutsch and Richard P. Feynman, was inspired by the idea of the supercomputer to look at the quantum effects on small components in a radically new way. Rather than being nothing but a source of error, perhaps these effects, which were governed by a well-understood theory, could be harnessed to perform useful computational work. Neither Deutsch nor Feynman was a computer scientist or an engineer; they were theoretical physicists who held an unorthodox view of reality. They both believed that reality had a certain parallel nature. The trajectory that a certain particle would travel, said Feynman, depends on the effects exerted by all the possible paths that the particle could travel.
In the 1990s, the first successful quantum logic gates were constructed by several different teams using different physical methods of implementing the qubits and the operators. A debate began in earnest about whether it would be possible to construct quantum computers consisting of several hundred qubits that could function without error for a reasonable amount of time. Such a machine would have immense practical importance, since it would be able to break the secret codes used in 250-digit public-key cryptography. Moreover, if such a machine functioned, some observers asserted, the fact would be of enormous metaphysical significance as well.
Deutsch has championed the many-worlds interpretation of quantum mechanics: whenever a quantum system in a superposition of two states makes a transition from one state to another, apparently by chance, the explanation of why that transition occurs is that there is no real transition. The observer sees only one of the two possible outcomes, and some parallel observer, in another universe, sees the other.
In 2011, the Canadian company D-Wave Systems announced that it had developed the first commercially available quantum computer, the D-Wave One, which ran on an unprecedented 128 qubits. Two years later, it revealed the D-Wave Two, featuring 512 qubits. One major obstacle to developing quantum computers with more than a handful of qubits has been the decoherence problem, which becomes more pronounced as the number of qubits increases. D-Wave sidestepped the problem by implementing an adiabatic model, which operates via quantum annealing rather than the traditional logic gates; as a result, coherence, or the lack thereof, plays less of a role. However, there was some debate over whether D-Wave's products could truly be considered quantum computers. While several tests in early 2013 produced some evidence that quantum entanglement, a hallmark of quantum computation, was taking place, the D-Wave Two was unable to perform benchmark tasks as fast as a classical computer system optimized for the task in question, even on tasks ideally suited to solution by quantum annealing.
While the success of adiabatic quantum computing remained uncertain, the logic-gate model took several small steps forward in 2013. First, teams of physicists at ETH Zürich and the University of Tokyo successfully performed quantum teleportation of information on a computer chip, using quantum entanglement to send the information from one place on the chip to another without it passing through the intervening space—a necessary step in developing solid-state quantum computers, which many consider to hold the most promise for the eventual scaling up of logic-gate quantum computers. Several months later, a team of researchers from Germany, Canada, and the United Kingdom managed to maintain superposed qubits at room temperature for thirty-nine minutes, shattering the previous record of only a few seconds. The qubits in question were represented by the nuclear spin states of ionized phosphorus atoms implanted in solid silicon.
Principal terms
BIT: a classical unit of information that has a value of either 0 or 1
DECOHERENCE: the tendency for superpositions to collapse into some definite state as a result of outside influences
INTERFERENCE: a property of intersecting waves and interacting atomic particles; can be either constructive or destructive
LOGIC GATE: an operation on either bits or qubits to produce a certain output
PUBLIC-KEY CRYPTOGRAPHY: a method of coding information whereby the sender of the coded message does not need to know how to decode it
QUANTUM ENTANGLEMENT: a phenomenon in which particles interact with each other in such a way that their quantum states thereafter remain related, even after the interaction has ceased
QUBIT: a quantum unit of information, realized by a quantum particle in some state, that has a value of 0, 1, or some superposition of the two
SUPERPOSITION: a phenomenon in which a quantum system exists in all possible states until it is observed or measured, at which point it collapses into only one state
UNIVERSAL GATES: logic gates that can, in self-combination, implement any logical operation whatsoever
Bibliography
Altepeter, Joseph B. "A Tale of Two Qubits: How Quantum Computers Work." Ars Technica. Condé Nast, 19 Jan. 2010. Web. 17 Jan. 2014.
Bennett, Charles H. "Quantum Information and Computation." Physics Today Oct. 1995: 24–30. Print. Develops a theory of what quantum information is. Contains a relatively simple discussion of the Shor factorization algorithm and touches on data compression, teleportation, and error correction.
Collins, Graham P. "Quantum Cryptography Defies Eavesdropping." Physics Today Nov. 1992: 21–23. Print. Provides exposition of a method, experimentally tested, that permits the dissemination of a key that guarantees that the intended recipient will know if the key has been intercepted.
DiVincenzo, David P. "Quantum Computation." Science 270.5234 (1995): 255–61. Print. A history of quantum computation that shows how earlier developments paved the way for later work.
Folger, Tim. "The Best Computer in All Possible Worlds." Discover Oct. 1995: 91–99. Print. An introduction to the views and motivations of quantum-computation pioneers David Deutsch and Seth Lloyd. Considers the bearing of quantum computation on the interpretation of quantum mechanics.
Feynman, Richard P. "Quantum Mechanical Computers." Foundations of Physics 16.6 (1986): 507–31. Print. A historically important paper by the first physicist to suggest a use for quantum computation, namely in simulating quantum systems.
Gershon, Eric. "New Qubit Control Bodes Well for Future of Quantum Computing." YaleNews. Yale U, 11 Jan. 2013. Web. 17 Jan. 2014.
Haroche, Serge, and Jean-Michel Raimond. "Quantum Computing: Dream or Nightmare?" Physics Today Aug. 1996: 51–52. Print. A short summary of arguments against the possibility of practical implementation of quantum computation to solve real-world problems.
Hsu, Jeremy. "D-Wave's Year of Computing Dangerously." IEEE Spectrum. IEEE, 26 Nov. 2013. Web. 16 Jan. 2014.
Hsu, Jeremy. "Quantum Bit Stored for Record 39 Minutes at Room Temperature." IEEE Spectrum. IEEE, 15 Nov. 2013. Web. 17 Jan. 2014.
Hughes, R. I. G. The Structure and Interpretation of Quantum Mechanics. Cambridge: Harvard UP, 1989. Print. A book by a philosopher of science that develops the necessary mathematics for understanding quantum computation, presuming only a knowledge of high-school algebra.
Knapp, Alex. "The First Quantum Teleportation in a Computer Chip." Forbes.com. Forbes.com, 17 Aug. 2013. Web. 21 Jan. 2014.
Lloyd, Seth. "Quantum Mechanical Computers." Scientific American Oct. 1995: 140–45. Print. An excellent nontechnical introduction to quantum computation by a physicist who bridges the gap between the computer-science community and the engineering community.
"New Magnetic Resonance Technique Could Revolutionise Quantum Computing." MIT Technology Review. MIT Tech. Rev., 7 Mar. 2011. Web. 17 Jan. 2014.
Schwarzschild, Bertram. "Labs Demonstrate Gates for Quantum Computation." Physics Today Mar. 1996: 21–23. Print. Surveys the engineering efforts that have produced a variety of different quantum logic gates.
Zurek, Wojciech H. "Decoherence and the Transition from Quantum to Classical." Physics Today Oct. 1991: 36–44. Print. Gives a variety of different mathematical arguments to explain why superpositions tend to take some definite state in a short amount of time, which provides a limit on effective quantum computation.