21 July 2005

New Hurdle For Quantum Computing

By Rusty Rockets

The development of a practical quantum computer has always looked a little uncertain (no pun intended). The problems stem from the incompatible relationship between classical and quantum mechanics. Quantum mechanical systems behave differently from larger bodies because they can exist simultaneously in two different states. This property gives quantum computers their edge over conventional computer systems, but a recent study shows that the quantum states required for quantum computing may be too unstable for any practical use. In an article entitled 'An Intrinsic Limit to Quantum Coherence due to Spontaneous Symmetry Breaking', recently published in the journal Physical Review Letters, Dutch theoretical physicists Jasper van Wezel, Jeroen van den Brink and Jan Zaanen, from the Foundation for Fundamental Research on Matter (FOM) and Leiden University, claim that the instability inherent in quantum bits could cause the information stored in them to spontaneously disappear.

A quantum computer gets its power by taking advantage of the properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, which serve simultaneously as the computer's processor and memory. Theory suggests that, by directing the interactions between qubits, a quantum computer should perform certain calculations, such as factoring, exponentially faster than conventional computers.
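
To make the "processor and memory at once" idea concrete, here is a minimal Python sketch, entirely our own illustration with invented function names rather than anything from the article or the papers it cites. It shows why n qubits are so powerful: their joint state carries 2^n amplitudes at once, which is also why simulating them on a classical machine gets expensive fast.

```python
import numpy as np

# Classical simulation of qubits as state vectors (illustration only; the
# exponential memory cost is exactly the resource real quantum hardware taps).

def n_qubit_zero_state(n):
    """State vector of n qubits, all initialized to |0...0>."""
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    return state

def hadamard_all(state):
    """Put every qubit into an equal superposition of |0> and |1>."""
    n = int(np.log2(len(state)))
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    op = np.array([[1.0]])
    for _ in range(n):
        op = np.kron(op, h)  # tensor product: one Hadamard per qubit
    return op @ state

state = hadamard_all(n_qubit_zero_state(3))
print(len(state))          # 8 amplitudes: 3 qubits span 2^3 basis states at once
print(np.round(state, 3))  # each outcome carries amplitude 1/sqrt(8)
```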

But qubits can become decoherent when they come into contact with the outside world. A quantum object's unique ability to occupy two discrete states at once is said to disappear once its quantum state is measured: measuring a qubit fixes its state, and it loses its coherence in the process. When this happens, any useful information stored in the qubit is lost, and without this information the quantum computer becomes useless. The trick, then, is to somehow insulate the qubits from the outside environment. But this is problematic: if quantum computers are to have any utility at all, we have to be able to access and input information.
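
In the state-vector picture, "measurement fixes the state" looks like the following toy sketch, again our own illustration: an outcome is drawn with probability given by the squared amplitudes (the Born rule), and the superposition collapses onto it.

```python
import numpy as np

# Toy illustration of why measurement destroys a superposition: sample an
# outcome according to the Born rule, then collapse the state onto it.

rng = np.random.default_rng(0)

def measure(state):
    """Measure in the computational basis; returns (outcome, collapsed state)."""
    probs = np.abs(state) ** 2               # Born rule: P(k) = |amplitude_k|^2
    outcome = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1.0                 # the superposition is gone for good
    return outcome, collapsed

qubit = np.array([1, 1j]) / np.sqrt(2)      # equal superposition of |0> and |1>
outcome, qubit = measure(qubit)
print(outcome, qubit)  # a definite 0 or 1; the stored phase information is lost
```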

The task has not been made any easier by van den Brink and his team's unexpected discovery. "We show that many-particle qubits always decohere, even when they are completely isolated from the outside world. That was a surprise for ourselves and for the rest of the scientific community alike," van den Brink told Science a GoGo. The researchers discovered that coherence tends to spontaneously disappear, even without external influences. The decoherence process is a result of quantum mechanical spontaneous symmetry breaking, which is somewhat analogous to spontaneous crystallization in classical physics: when a crystal spontaneously forms at a certain position in a solution, the uniform structure of the fluid is broken. The researchers believe that the coherence in some highly promising concepts for qubits will disappear after about a second. Moreover, they say, the smaller the qubits, the faster the process occurs. All of this would seem to pose a fundamental limitation on the development of quantum computers.
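
As a rough intuition pump only, the toy below treats a qubit's coherence as an exponential decay whose time constant grows with the number of particles in the qubit. The linear scaling and all the numbers are our illustrative assumptions, standing in for the article's qualitative claim that smaller qubits decohere faster; the real rate comes from the paper's symmetry-breaking analysis, which we do not reproduce here.

```python
import numpy as np

# Toy model of coherence leaking away on its own: the off-diagonal element of
# a qubit's density matrix shrinking exponentially. The tau ~ N scaling is a
# stand-in for "smaller qubits decohere faster", not the paper's derivation.

def coherence(t, n_particles, tau_per_particle=1e-4):
    """|rho_01|(t) under a simple exponential-decay stand-in."""
    tau = tau_per_particle * n_particles  # bigger qubit, slower decay (assumed)
    return 0.5 * np.exp(-t / tau)

for n in (1_000, 10_000, 100_000):
    print(n, coherence(t=1.0, n_particles=n))  # coherence left after one second
```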

However, it seems that decoherence in itself is not necessarily the death knell for quantum computing. The most successful and complex quantum calculation yet accomplished used a technique involving nuclear magnetic resonance, conceived in the mid-1990s by Isaac Chuang and Neil Gershenfeld of MIT. IBM chemists, in collaboration with Chuang and Gershenfeld, then designed and made a new molecule with seven nuclear spins - the nuclei of five fluorine and two carbon atoms - which could interact with each other as qubits. These were programmed by radio-frequency pulses and detected by nuclear magnetic resonance (NMR) instruments. The IBM scientists controlled a vial of a billion billion (10^18) of these molecules so that they executed an algorithm and correctly identified 3 and 5 as the factors of 15. It may not sound very impressive, but it does demonstrate how a quantum computer works.
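
For a sense of what that experiment computed: Shor's algorithm reduces factoring to period finding, and only the period-finding step needs a quantum computer. The Python sketch below, our own illustration, brute-forces the period classically in place of the quantum subroutine, then recovers the factors of 15 with two gcd calculations. The base a = 7 is a conventional textbook choice, not necessarily the one used in the experiment.

```python
from math import gcd

# Classical skeleton of Shor's algorithm for N = 15. The quantum computer's
# only job is the period-finding step, done here by brute force instead.

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n) -- the quantum subroutine's output."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a=7):
    r = find_period(a, n)                 # e.g. a = 7, N = 15 gives r = 4
    if r % 2 == 1:
        raise ValueError("odd period; retry with another base a")
    return gcd(a ** (r // 2) - 1, n), gcd(a ** (r // 2) + 1, n)

print(shor_factor(15))  # (3, 5), matching the experiment's answer
```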

Like van den Brink, Chuang and Gershenfeld also noted the limiting effects of decoherence on quantum computing. "Rotating nuclei in a fluid will, like synchronized swimmers robbed of proper cues, begin to lose coherence after an interval of a few seconds to a few minutes. The longest coherence times for fluids, compared with the characteristic cycle times, suggest that about 1,000 operations could be performed while still preserving quantum coherence," they claimed. They also said error correction could be used to increase this limit: "Fortunately, it is possible to extend this limit by adding extra qubits to correct for quantum errors." Conventional computers use extra bits to accommodate errors, and it would seem the same can be done quantum-mechanically.
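
The classical version of "adding extra bits" is the three-bit repetition code sketched below: store each bit three times and take a majority vote on readout. Quantum error correction is more delicate (unknown quantum states cannot simply be copied), but the redundancy idea is the same. This is our own toy illustration, not the scheme Chuang and Gershenfeld describe.

```python
import random

# Classical 3-bit repetition code: the simplest form of "extra bits to
# accommodate errors". Quantum codes must work around the no-cloning theorem,
# but the redundancy-plus-majority-vote idea carries over.

def encode(bit):
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob=0.1):
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    return int(sum(bits) >= 2)  # majority vote survives any single flip

random.seed(1)
sent = 1
received = decode(noisy_channel(encode(sent)))
print(sent, received)  # majority vote recovers the bit even if one copy flips
```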

Van den Brink agrees with this approach: "One can fight the finite coherence time by quantum error correction procedures." He stresses that his team's work only shows that decoherence occurs irrespective of any measures taken to prevent contact with outside influences, not that quantum errors are uncorrectable.

Inspired by their initial NMR-based quantum computations, and the possibility of robust error correction, Chuang claimed: "If we could perform this calculation at much larger scales - say the thousands of qubits required to factor very large numbers - fundamental changes would be needed in cryptography implementations." But it turns out that scalability is the biggest stumbling block for NMR quantum computing. "The problem with NMR is that one cannot build a scalable quantum computer with it. This means one can make maybe four or five qubits and do a computation with those bits, but not many more. So an NMR quantum computer will never do any practically interesting computation," said van den Brink.

This is perhaps why van den Brink's team has focused on larger forms of qubits that still possess intrinsic quantum properties, rather than single particles. "The qubits that we consider do not consist of single particles [or single spins of electrons or nuclei], but of many particles in a quantum mechanical superposition. So our qubit is more like Schroedinger's cat. A cat consists of very, very many particles, so one might call the qubits that we consider 'Schroedinger kitten' qubits," van den Brink said. It is these nano-mechanical qubits that could form the basis of practical quantum computers and communications devices. One recent example of such a component is the nano-mechanical oscillator developed by physicists at Boston University. The device is the fastest of its kind, oscillating at 1.49 gigahertz, or 1.49 billion times a second. The quantum effect is evident when the nanomechanical oscillator starts to jump between two discrete positions without occupying the physical space in between.

It's clear that many of the conundrums surrounding quantum states are related to measurement. But what exactly does measuring a qubit entail? According to van den Brink, the questions "how does 'measuring' a qubit instantly fix its state?" and "what form does this measuring take?" are two of the most important in physics today. "Precisely this problem is the one that we are really interested in: it was the motivation for our research project!" van den Brink tells us. The problem is called the measurement problem, known more formally as 'the collapse of the wave function'. A quantum device cannot by itself serve as a measuring instrument, because "this machine is also subjected to the laws of quantum physics. After all, it is made of microscopic stuff similar to the small quantum system on which the machine acts," van den Brink said.

"The essence of it was already formulated by Bohr in more or less the following way. What do we mean by a�measurement? With this we mean that we connect a [small] quantum mechanical system to a [large] classical measurement machine. It is essential that one system is quantum and the other one classical. Otherwise, by connecting a quantum system to a quantum system, we can never perform a measurement: if we measure a voltage with a voltmeter, the voltmeter is a classical object, its needle pointing in some definite direction: the voltmeter is not allowed to be in a superposition: its needle cannot point towards two directions at the same time. If we would allow for this possibility, this wouldn't correspond to a measurement as we need one number, one voltage, to be the result of the measurement. In other words, in order to understand the measurement process we need something more than quantum mechanics. The thing we need more is classical mechanics! This situation is intellectually very unsatisfactory, but this is the way it is," van den Brink explained.

So there still seems to be a long way to go on the quantum computing front. Larger quantum components, like the nano-mechanical oscillator, will likely need to be measured for various reasons, so unless better systems of classical measurement are devised, coherence problems triggered by the act of measurement look set to plague quantum computing. And even if the measurement problem is solved, the "natural" decoherence identified by van den Brink's team predicts that qubits will decohere anyway, so any method of error correction will have to be more sophisticated than merely pumping extra atoms into the mix.

References:
Chuang and Gershenfeld's paper
Van den Brink et al.'s paper