The first half of the twentieth century gave rise to two significant accomplishments that have had a profound impact well into the twenty-first century: the development of the quantum theory of matter, and Alan Turing’s foundational work on the universal computer. Quantum theory provided physicists with a deep understanding of matter on the atomic scale and has proven to be one of the most accurate theories ever developed in terms of predicting the behavior of physical systems. Turing’s work laid the groundwork for stored-program computers. Developments in quantum theory led in 1947 to the invention of the transistor, a semiconductor device that could amplify and switch electrical signals. Independently, the theory of digital computing and the implementation of full-scale, general-purpose digital computers took shape in the 1950s and 1960s.

In the 1980s, physicist Richard Feynman published several papers in which he examined the ability of current computing technology to simulate physics, in particular quantum physics. He highlighted the fact that to adequately describe a system of *n* particles, a computer would need to keep track of 2^{n} real numbers, leading to an exponential scaling of the storage requirements. So, for even a small number of particles, say 50, a computer must track 2^{50} (roughly 10^{15}) real numbers. If each number consumes about 64 bits of memory, the overall memory requirement would be on the order of 10^{17} bits, or about 100 petabits. That would strain even current supercomputers. Feynman also discussed the challenge of computational efficiency and how long such calculations might take to complete. Given the complexity of simulating quantum systems with classical computers, and the inadequacy of existing technology and approximations, he asked: why not use one quantum system to simulate another? Starting with a quantum system that is well understood and characterized, it could be possible to simulate the behavior and properties of another quantum system that is not so well understood. Thus was born the basic concept of quantum computing.

This early work has spawned the field of quantum information science and technology, which deals with the manipulation, storing, and transmission of information by taking advantage of the quantum mechanical properties of light and matter. One of the key distinctions between quantum and classical information and computation is that quantum information processing deals with direct manipulation of individual quanta (single quantum objects with well-defined quantum states), whereas classical devices rely on the macroscopic behavior of a large number of quanta.

Classical computers use electric voltage levels to represent the logic states of binary digits (bits) and gates that implement Boolean logical operations that transform the bit values (0 and 1) as part of the computation. Early digital computers used vacuum tubes, and then transistors, to create the voltage levels and implement the logic gates. Eventually, these technologies gave way to very-large-scale integrated circuits, in which transistors and other components were directly patterned onto a silicon die along with the electronic pathways. To squeeze more transistors into the same amount of space, engineers successively reduced the size of these circuit elements, with the most recent chips incorporating gate sizes on the order of 30 nanometers (for comparison, the read/write head in a hard drive floats about 10 nanometers above the disk surface, and the size of a silicon atom is on the order of 0.22 nanometers). At some point, it will not be possible to shrink the gate size and increase the packaging density without having to fundamentally change the way these computer chips are designed with respect to the inherent quantum nature of matter (to say nothing about the ability to control the voltages, currents, and heat within these dense structures).

A quantum computer, on the other hand, uses individual quanta and their states as quantum bits, or *qubits*, providing the logical representation of binary information. The physical realization of a qubit is a physical system with two quantum states that can be used to represent the 0 and 1 bit states. Using Dirac notation, these states are represented as |0〉 and |1〉. Quantum computation can be defined as the application of a unitary transformation to a set of qubits, followed by some type of measurement on at least one of the qubits to obtain a classical number. A common model of quantum computation, adapted from classical computation, is the circuit model, in which quantum computations are represented as quantum “logic circuits” whose elements represent qubits and quantum gates, including measurement.

The fact that qubits obey the laws of quantum theory and the associated probabilistic interpretation has several important consequences that differentiate quantum and classical computation. Some of the unique aspects that give quantum computing its power include:

**Superposition.** Unlike classical bits, qubits can exist in a superposition of basis states; thus, a qubit |ψ〉 = α|0〉 + β|1〉 can be created from the 0/1 basis states, where α and β are complex numbers called “probability amplitudes.” A classical bit can only represent a 0 or 1, never any intermediate or superposed value. The ability to create qubits that are a superposition of the basis states is what gives many quantum algorithms their power over their classical counterparts, allowing, for example, the simultaneous evaluation of a function over a large number of possible values.

**Measurement.** Quantum measurement involves interacting with a qubit in such a way that the state of the qubit will be different after the measurement. Results of quantum measurement are based on expectation values that depend on the squared magnitudes of the probability amplitudes of the different quantum states.

**Interference.** While waves in classical physics may interfere, interference is not a phenomenon exploited in classical computing. In quantum computing, however, individual quanta can interact with a relative phase, so interference among a set of qubits is quite important. Quantum interference arises from the relative phases of probability amplitudes and has a direct impact on the performance of two important quantum algorithms: the quantum Fourier transform and quantum search. These algorithms achieve their speed through superposition and interference, which cause the states of interest for the particular problem to end up with larger probability amplitudes within the superposition, providing a quantum parallelism that, in effect, examines many possible solutions simultaneously.

**Entanglement.** This uniquely quantum mechanical phenomenon results in highly nonclassical correlations between qubits. Essentially, when two particles are entangled, the act of measuring one immediately determines the state of the other, regardless of the distance separating them. This does not violate special relativity, because a classical, finite-speed communication channel is required to actually transfer information using entangled states. Consider a simple example of two qubits |A〉 and |B〉. One possible joint state of these two qubits is the simple product |A〉|B〉; however, it is also possible to create the following entangled state:

|Ψ〉 = (|0〉|0〉 + |1〉|1〉)/√2

It is not possible to write this state as the product of two separate quantum states. In this example, if qubit |A〉 is measured and found in state |0〉, then this immediately determines that qubit |B〉 will also be observed in the state |0〉.

**No copying of qubits.** Unlike classical bits, it is not possible to create perfect, independent copies of qubits. Creating a copy of a qubit requires knowledge of the complete state of the qubit; however, obtaining that knowledge requires performing measurements on the qubit, which destroys its state. This fact has a profound impact on both quantum computing and cryptography.
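The measurement rule described above can be illustrated with a short simulation. The particular amplitudes and shot count here are illustrative assumptions, not tied to any hardware:

```python
import random

def measure(alpha, beta, rng):
    """Simulate one measurement of the qubit alpha|0> + beta|1>:
    outcome 0 with probability |alpha|^2, outcome 1 otherwise."""
    p0 = abs(alpha) ** 2
    return 0 if rng.random() < p0 else 1

rng = random.Random(42)
alpha, beta = 3 / 5, 4 / 5           # |alpha|^2 + |beta|^2 = 1
# Each measurement collapses the state, so statistics require many
# identically prepared copies of the qubit.
shots = [measure(alpha, beta, rng) for _ in range(10_000)]
print(sum(shots) / len(shots))       # close to |beta|^2 = 0.64
```

Note that a single measurement reveals almost nothing about α and β; only the statistics over many prepared copies do, which is also why a qubit cannot simply be read out and copied.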

The realization of a quantum computer will require advances in scientific understanding of how to create and control the quantum state of individual qubits and collections of these qubits. This understanding will have to be translated into the engineering advances needed to design and implement a reliable quantum computer. While there have been several proof-of-principle demonstrations of the basic operation of simple quantum computer circuits using a small number of qubits (fewer than 15), no one currently understands how to realistically scale up to the thousands or millions of qubits that may be necessary for useful computations.

In 2000, David DiVincenzo of IBM published a set of criteria for assessing the viability of any physical implementation of a quantum information processing system (see sidebar, Criteria for a Quantum Computer). It has been challenging to find physical implementations that satisfy all of the criteria. Because qubits interact with each other and with their environment, their states can change, making them fragile and prone to errors; thus, it has been necessary to develop approaches to correct these errors as much as possible. The qubits in a quantum system will become entangled with one another, and with their environment, which may corrupt their quantum state and lead to a failure of the quantum circuit. If errors accumulate faster than the computation can be completed, or if they grow exponentially, the quantum computer will fail. One of the early successes in assessing the feasibility of quantum computing was the discovery by mathematician Peter Shor of specific approaches for quantum error correction.

At present, researchers are exploring a variety of physical implementations for qubits and quantum gates, including trapped ions, superconducting circuits, linear optics/photonics, quantum dots, and nitrogen valence centers in synthetic diamond, to name a few. There is no clear winner, and one should expect that quantum computers of the future would, like classical computers, require a variety of technologies that are each suited for specific purposes.

A quantum algorithm is a mathematical description of how to perform certain computational tasks using quantum resources, such as qubits and quantum gates. There was only moderate interest in quantum computing until Shor published his now famous factoring algorithm in 1994. The ability to efficiently factor large numbers is an important capability that affects many areas of mathematics and cryptography.

For example, computing the prime factors of large numbers (hundreds to thousands of digits long) is extremely difficult. Using the most efficient known classical algorithm on the fastest supercomputer, factoring a 2048-bit number would take longer than the age of the universe (the best known classical algorithms scale super-polynomially in the size of the input). However, Shor was able to show that because factoring can be reduced to determining the period of a modular function, one can apply the quantum Fourier transform to compute this period, and thus determine the prime factors, with polynomial efficiency. The theoretical algorithm requires on the order of *L* qubits and *L*^{3} computational time steps, where *L* is the number of bits of the integer being factored.
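The reduction Shor exploited can be sketched classically. In the toy example below, the brute-force period search is the exponentially expensive step that a quantum computer replaces with the quantum Fourier transform; the numbers are illustrative:

```python
from math import gcd

def period(a, n):
    """Smallest r > 0 with a**r = 1 (mod n), found by brute force.
    This search is the step Shor's algorithm performs efficiently."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n, a):
    """Try to split n using the period of f(x) = a**x mod n."""
    r = period(a, n)
    if r % 2:
        return None                    # odd period: try another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None                    # trivial square root: try another a
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_reduction(15, 7))           # prints: (3, 5)
```

For N = 15 and a = 7 the period is r = 4, so gcd(7² ± 1, 15) yields the factors 3 and 5; for cryptographically sized N, only the period-finding step changes.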

Another important quantum algorithm deals with data searching. Given an unstructured set of *n* objects, determining whether *x* is a member of that set will in general require on the order of *n* queries. In 1997, Lov Grover proposed a quantum algorithm that could compute the result in only about √*n* queries.
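Grover's amplitude-amplification idea can be illustrated with a small state-vector simulation (the set size and marked index below are illustrative). The oracle flips the sign of the marked amplitude, and the diffusion step reflects every amplitude about the mean; after roughly (π/4)√N iterations the marked item dominates:

```python
import math

N, marked = 8, 5
amps = [1 / math.sqrt(N)] * N                        # uniform superposition

for _ in range(round(math.pi / 4 * math.sqrt(N))):   # 2 iterations for N = 8
    amps[marked] = -amps[marked]                     # oracle: flip marked sign
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]              # diffusion: reflect about mean

probs = [a * a for a in amps]
print(max(range(N), key=probs.__getitem__), round(probs[marked], 3))
# prints: 5 0.945
```

Two queries suffice to find one item among eight with about 95 percent probability, versus an average of four classical queries; the quadratic gap widens as N grows.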

Another important application of quantum information science involves secure communications. Consider the following scenario in which Alice wants to send a message to Bob without letting Eve (the eavesdropping spy) know the content of the message. Alice and Bob may communicate over an open channel, so it’s possible that Eve could intercept their messages. Alice and Bob therefore need to encrypt their messages before sending.

If they use symmetric key cryptography, they each need to share the same cryptographic key, which must be kept secret and out of the hands of Eve. They must have some method of securely creating, sharing, and safeguarding this key. They then use this key with an encryption algorithm to encrypt their messages before sending them over the open channel. The only classical encryption algorithm that is mathematically proven to be secure is known as the one-time pad, which uses a unique random key for each message that is the same length as the message. Once used, that key is discarded. For short messages, this approach might be feasible, but as the message size increases, it becomes impractical. Another consideration is the practicality of creating and sharing a new key for each message.

Thus, Alice and Bob are faced with the challenge of managing their encryption keys and keeping them out of the hands of Eve. Quantum key distribution might be the solution to their problem. It uses the quantum properties of photons as part of an overall protocol for the secure generation and sharing of cryptographic keys between two parties during a single communication session. Each new session will result in a new, unique key being created and exchanged. For an ideal implementation, the security of this approach rests not on the computational complexity of some mathematical algorithm, but on the physical laws of quantum theory (in particular, the fact that measurement alters the quantum state, and that it is not possible to make a perfect copy of a quantum state).

Several possible protocols have been proposed for quantum key distribution. One such protocol, BB84 (named after its inventors, Bennett and Brassard), uses polarized photons. This protocol involves six basic steps that employ both a quantum channel and a classical communication channel.

Authentication is the first step, in which Alice and Bob must verify that they are in fact communicating with each other and not someone else. This authentication step is a security measure designed to establish the validity of a transmission, message, or originator, or a means of verifying an individual’s authorization to receive specific categories of information. It may be accomplished using classical protocols, and only needs to be done at the beginning of the session.

In the next step, Alice and Bob use the quantum channel to send and receive photons. Alice transmits a stream of photons, each given one of four randomly generated polarization states, drawn from one of two bases: rectilinear or diagonal. Bob randomly sets the basis of his measurement device and measures each photon according to the protocol. On average, Bob will choose the correct basis only 50 percent of the time.

Next, Alice and Bob use an open classical communication channel to exchange information regarding the basis used to transmit each photon. When the transmitted and received bases agree, Alice and Bob will retain the corresponding bit, discarding about half the candidate bits on average, as mentioned above. The bit values associated with the retained data comprise the “sifted bits,” which will contain additional sources of error that must be corrected in the next step, known as reconciliation, which also results in a smaller set of bits.
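The transmission and sifting steps above can be sketched in a toy simulation, assuming an ideal, eavesdropper-free channel and illustrative bit counts (0 = rectilinear basis, 1 = diagonal):

```python
import random

rng = random.Random(7)
n = 1000
alice_bits  = [rng.randint(0, 1) for _ in range(n)]
alice_bases = [rng.randint(0, 1) for _ in range(n)]
bob_bases   = [rng.randint(0, 1) for _ in range(n)]

# When Bob guesses the wrong basis, his measurement outcome is random.
bob_bits = [bit if ab == bb else rng.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only positions where the bases matched (about half of them).
sifted = [(a, b) for a, b, ab, bb
          in zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
print(len(sifted), all(a == b for a, b in sifted))
```

In this idealized run the sifted bits agree exactly; with real hardware (and possibly Eve) the sifted strings differ, which is what the reconciliation step corrects.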

Even though Alice and Bob share an identical set of bits after the sifting and reconciliation steps, an eavesdropper may have gained some information about these bits. Thus, the next step is to generate the secret key through a process known as privacy amplification, which can be described as the art of distilling highly secret shared information from a larger body of shared information that is only partially secret. This will allow Alice and Bob to start with a shared random bit sequence (about which Eve may have some information) and create a shorter shared random key (about which Eve has essentially no information).
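One standard way to realize privacy amplification is hashing with a random binary matrix (a 2-universal hash). The sketch below uses illustrative sizes; in practice Alice and Bob would derive the same matrix from publicly exchanged randomness so that both compute the identical shorter key:

```python
import random

def privacy_amplify(bits, out_len, rng):
    """Compress a reconciled bit string to out_len key bits by multiplying
    it with a random binary matrix over GF(2) (each key bit is the parity
    of a random subset of the input bits)."""
    key = []
    for _ in range(out_len):
        row = [rng.randint(0, 1) for _ in bits]
        key.append(sum(r & b for r, b in zip(row, bits)) % 2)
    return key

rng = random.Random(1)
sifted = [rng.randint(0, 1) for _ in range(128)]   # shared reconciled bits
key = privacy_amplify(sifted, 64, rng)             # shorter, more secret key
print(len(key))                                    # prints: 64
```

Because each output bit mixes many input bits, Eve's partial knowledge of individual sifted bits translates into essentially no knowledge of the shorter key, at the cost of discarding part of the shared material.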

Lastly, as part of the session, Alice and Bob can agree to save some of the shared secret bits so that they can be used as part of the authentication step for the next session.

The theoretical BB84 protocol assumes perfect devices, such as single-photon sources and noise-free detectors; however, engineered systems will have to be constructed from realistic, noisy devices and propagate photons through lossy media (optical fiber or Earth’s atmosphere). The impact of these factors on the overall security of any given quantum key distribution implementation is critical to its applicability for national security space. Depending on hardware performance and the effects of the propagation media (and possibly Eve), there can be a significant reduction in the number of usable bits in going from the number of transmitted photons to the final secure key bits. The secure key rate and quantum bit error rate are two important system performance parameters that must be optimized for any implementation.

While quantum information technology appears to offer many potential benefits for certain computational problems and secure communications, it cannot be transitioned into national security space without a detailed assessment of the underlying technologies and system implementations. Also, in evaluating any new technology, it is important for stakeholders to understand both the potential benefits for users as well as the threats that might arise, should an adversary obtain such a capability.

Aerospace has been working to develop a detailed understanding of both the benefits and threats posed by quantum information processing in order to advise customers, provide the necessary subject matter expertise, advance the state of the art (as with any technology relevant to national security space), and support transition planning.

**Potential Benefits**

Space-system development presents problems that are computationally complex, and operational needs may require the acceptance of approximate solutions. Examples include optimizing the design of a satellite constellation to suit a given set of constraints, optimizing the priority tasking of a given set of assets with specified constraints, or fusing data and extracting information from multiple sources. Solving these problems often requires state-of-the-art algorithms running on supercomputers. There is considerable research under way in evaluating those classes of problems for which known or new quantum algorithms may provide a more efficient solution than classical algorithms and computers.

One interesting problem that Aerospace is studying involves possible application of quantum computing to improve how classical programs are compiled and executed on distributed and clustered computers. Software developed for these classical systems must be compiled to run effectively and efficiently. These compilers perform various types of optimization and instruction scheduling based on knowledge of the target hardware; they employ (mostly) heuristics to arrive at a tractable solution within an acceptable period of time. It may be possible to use a quantum computer to find better solutions for the classical compiler optimizations and scheduling to allow improved use of the classical supercomputers.

Finally, returning to Feynman’s original interest in studying quantum computation—the prospect of quantum simulation—offers further potential. Many of the computational problems of interest to national security space involve the development of better materials or a better understanding of material properties and behavior in adverse conditions. This might be an area where quantum simulation could provide advantages over the classical techniques through a more direct and accurate simulation of the physical properties and behaviors of these materials.

**Potential Threats**

Technology is a two-edged sword, and the advantages of quantum computing are tempered by the possible drawbacks. The most obvious threat is that an adversary will apply the resources of a full-scale quantum computer as a cryptanalysis tool. The ability to implement Shor’s factoring algorithm could directly put at risk several classes of encryption algorithms. The application of Grover’s quantum search algorithm provides some, albeit modest, acceleration in brute-force search that could also be applied to cryptanalysis.

Clearly, it would be prudent to start implementing encryption technology that would be immune to attack from a quantum computer; however, adversaries are probably storing information that could be decrypted by a future quantum computing capability, so it is also important to perform a lifetime assessment, delineating the ramifications of having secret information compromised by a quantum computer at a later date.

Many countries have research and development programs in quantum information technology. In Europe, several quantum key distribution implementations have been deployed for secure banking and voting transactions. Japan has announced a metropolitan-scale quantum key distribution network in Tokyo. The European Union has even proposed a ground-to-space demonstration of quantum key distribution, sending a key from a ground station to a receiver on the International Space Station.

Quantum computing and key distribution are important technologies with possibly significant ramifications for future space missions. Researchers at Aerospace have been identifying near-term (5–10 years) and long-term (beyond 10 years) challenges in the area of national security space that might be addressed through quantum computing and quantum key distribution. In addition, Aerospace has been tracking general trends and conducting targeted research in anticipation of increased interest within the space system community.

**Creating Qubits with Ultracold Molecules**

One recent project focused on developing a quantum information processing testbed using ultracold molecules as physical qubits. Rubidium and cesium were selected because the laser cooling of these species is well understood, as are the quantum states of the rubidium cesium molecule (Aerospace has extensive experience in the laser cooling of these atoms as a result of the corporation’s work in atomic clocks, such as those used in navigation satellites). The project demonstrated the formation of ultracold rubidium cesium polar molecules by photoassociation. The researchers developed a practical quantum transition scheme to efficiently produce ultracold rubidium cesium molecules in the lowest quantum states. A carbon dioxide laser was used to trap and store the ultracold atoms and molecules. The next step will be to fully implement the quantum transition scheme and demonstrate qubit operation with ultracold rubidium cesium molecules in an optical trap or lattice.

**Systems Analysis and Engineering of a Quantum Computer**

In 2006, Aerospace researchers completed a study entitled, “The Effects of Quantum Information Technology on the Security of Space Systems.” The study was one of the first to assess the impact of quantum computers on space information security and the possibility of retroactive data decryption.

In 2008, researchers embarked on a project to explore possible quantum computer architectures and components. The research focused on applying a rigorous systems engineering process to the analysis of a quantum computer system to meet user requirements based on a fictitious but representative cryptanalysis mission. It examined how top-level system requirements, based on user needs, affected the subsystem-level requirements and how these compared with current and projected technology capabilities. This work also explored how classical concepts of reliability and fault-tolerance could improve the design of a quantum computer system. It also developed a quantum programming language (similar to high-level classical programming languages), a quantum computer compiler, and associated analysis tools to estimate the resource requirements (physical qubits and gates) needed to achieve reliable computation.

More recently, the researchers began a project to improve the tools and methods for evaluating quantum computer system designs. The work involves extending the Aerospace quantum computer compiler into a complete quantum computer design and analysis toolbox that will enable better prediction of the resources and overhead needed to support the necessary quantum error correction and control protocols. These tools will provide a foundation for the development and evaluation of improved error correction and control protocols that will help minimize resource overhead in realistic quantum computers.

The next step will be to extend the Aerospace quantum computer compiler to include other models of quantum computation, additional quantum error correction and control protocols, and additional compiler passes to optimize the quantum assembly code and minimize the number of steps required to implement the quantum program. Using the quantum computer design and analysis toolbox, researchers will be able to analyze the performance of a given quantum program as a function of the quantum error correction and control protocols and resource overhead. The compiler backend code generation (quantum instruction-set architecture) will be extended to incorporate a well-defined interface to accommodate other quantum instruction-set architectures based on different physical quantum computer architectures.

Finally, it will be necessary to develop methods to verify and validate the quantum computer designs as well as the analysis toolbox software and resource estimates. The correctness of these methods must be assessed before the results can be applied to more complex quantum computer system designs.

**Quantum Key Distribution Test and Evaluation Facility**

Another project is focused on developing a complete test and evaluation facility to assess the information assurance aspects of specific quantum key distribution implementations. The facility will allow assessment of both security and system performance in terms of secure key rate and quantum bit-error rate. Researchers will assess the security of specific hardware implementations of quantum key distribution protocols, including their confidentiality, integrity, and availability. The impact of side-channel attacks will also be evaluated. The facility will also provide quantitative data for the development of performance models that can be used to assess the potential for long-distance (ground-to-space) quantum key distribution.

The technology development and field-testing of quantum key distribution has progressed faster than that of quantum computing. Many of the technical challenges for scaling up quantum key distribution are reasonably well understood; however, the task of designing and implementing a secure quantum key distribution system and evaluating its security in an operational context is extremely challenging and could take many years. During this time, it will be necessary to also develop a better understanding of where, and how, to employ quantum key distribution systems to enhance space mission assurance.

Early computer systems from the late 1950s up to the 1980s were dominated by large mainframe systems and minicomputers. The early mainframes required a full-time staff to operate and maintain them. Users did not typically interact directly with these systems, but rather submitted their jobs—manually at first, and eventually through automated job submission and scheduling tools for batch processing.

Given the current rate of technological progress in quantum computing, early quantum computers may employ relatively few qubits—on the order of tens to a few hundred. Programming these systems and maintaining availability will be challenging. These early implementations will provide fertile ground for experimentation with scaling these systems to large numbers of qubits and gates (on the order of millions). These large-scale quantum computers may resemble the early mainframe systems and look like large-scale quantum physics experiments in which a quantum core will be controlled by a complex classical computer network.

Users will not interact directly with quantum computers, but will rely on an intermediary classical system to load, execute, and interpret quantum programs. When the job is complete, the user would receive a message with a link to the results. Other configurations may also be implemented—not as general-purpose quantum computers, but as special-purpose systems to solve specific problems. Initially, users will need a solid understanding of quantum information, quantum computing, and quantum programming to use these systems (there may be an analogy to the early machine language programming of classical computers, compared to the ubiquitous high-level programming languages seen today). Such systems will also require a full-time staff to maintain.

As small quantum computers become a reality, one might expect the field of quantum computer science to expand at an increasing rate, given the availability of actual hardware on which to experiment and test new ideas. Understanding quantum information science requires some degree of expertise in a variety of fields, including physics and computer science. Many universities have interdisciplinary programs in quantum information science, training the future generation of quantum computer scientists. These new researchers may uncover additional quantum algorithms and reveal as-yet-undiscovered power in quantum computing.

As scientific knowledge of how to control and manipulate quantum systems improves and the implementation technologies are refined, quantum computer systems may someday follow an evolutionary path similar to that of the classical computer, in which engineers developed the techniques to create highly integrated circuits and relatively compact physical implementations. That path could ultimately lead to a quantum computer in space.

Quantum information is a multidisciplinary field requiring skills from physics, mathematics, computer science, and engineering. The author would like to acknowledge discussions and collaboration with Tzvetan Metodi, Leo Marcus, He Wang (primary investigator for the ultracold molecule qubit research), and Benjamin Bowes (primary investigator for the quantum key distribution test and evaluation facility).

- D. Bacon and W. van Dam, “Recent Progress in Quantum Algorithms: What Quantum Algorithms Outperform Classical Computation and How Do They Do It?” *Communications of the ACM*, Vol. 53, No. 2, pp. 84–93 (2010).
- D. DiVincenzo, “The Physical Implementation of Quantum Computation,” *Fortschritte der Physik*, Vol. 48, No. 9–11, pp. 771–783 (2000).
- R. Feynman, “Quantum Mechanical Computers,” *Foundations of Physics*, Vol. 16, pp. 507–531 (1986).
- R. Feynman, “Simulating Physics with Computers,” *International Journal of Theoretical Physics*, Vol. 21, No. 6/7, pp. 467–488 (1982).
- M. Nielsen and I. Chuang, *Quantum Computation and Quantum Information* (Cambridge University Press, 2000).
- J. Nordholt and R. Hughes, “A New Face for Cryptography,” *Los Alamos Science*, Vol. 27, pp. 69–85 (2002).
- P. Shor, “Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer,” *SIAM Journal on Computing*, Vol. 26, No. 5, pp. 1484–1509 (1997).
- “Ultracold Molecules,” *Crosslink*, Vol. 6, No. 1, p. 31.

Back to the Spring 2011 Table of Contents

