SCIENCE/TECHNOLOGY
November 6, 2000
Volume 78, Number 45
CENEAR 78 45 pp.35-39
ISSN 0009-2347

Physicist Eugene Wigner's representation of a quantum superposition state (the two lumps), showing interference fringes in the center. The image also corresponds to a qubit in a superposition state of 0 and 1. [Image courtesy of Isaac Chuang and Wojciech Zurek]

Elizabeth K. Wilson
C&EN West Coast News Bureau

The acceleration of computing power never seems to slow. With each blink of an eye, there's a new, faster processor or a more data-storage-intensive hard drive. But for all their computational might, computers as we know them will eventually bump up against the laws of physics.

Technology marches on resolutely, shrinking electronic components and cramming more circuitry onto smaller and smaller wafers of silicon. If the current rate of miniaturization continues, computer experts predict that within a decade or two, transistors will dwindle to the size of an atom. But at those dimensions, well-behaved, predictable classical behavior goes out the window, and the slippery, untenable nature of quantum mechanics takes over. In the quantum world, rather than being entities with sharply defined positions and motions, particles are described by spread-out wavefunctions, seemingly existing in many places at once.

So it might seem that the power of computers is destined to reach a limit. But scientists usually don't take such pronouncements at face value--in this case, they have long been aware of a way around this apparent constraint. For within the shadowy quantum netherworld is more potential computing power than the speediest processor could ever dream of. That power stems from quantum particles' penchant for existing in more than one state, as well as their ability to become inextricably linked to each other by a phenomenon known as entanglement.

Traditional computers perform calculations, however quickly, in a basically sequential manner. Their limitations surface in the simple, yet striking, example of factoring a large number. The time a computer spends searching for a number's factors increases astronomically with the size of the number. To factor a 400-digit number, for example, would take a modern computer billions of years.

On the other hand, a computer made of quantum particles has a built-in parallelism because quantum calculations can be performed on the particles' coexisting states simultaneously. A quantum computer, then, might factor that 400-digit number in minutes. Such a completely different approach to computing, it seems, truly earns the designation "paradigm shift."

Quantum computers were once a fanciful concept entertained only by the brainiest of scientists. In the early 1980s, the late, great physicist Richard P. Feynman proposed that, since classical computers did a miserably clunky job of modeling the complex dynamics of quantum systems, perhaps a computer that operated according to quantum principles might be the answer. Other theorists, such as Charles H. Bennett at IBM Thomas J. Watson Research Center in Yorktown Heights, N.Y., and Paul A. Benioff at Argonne National Laboratory, were also toying with the idea, publishing groundwork-laying papers that showed, among other things, how quantum particles might function as computer bits. In 1985, physicist David Deutsch at Oxford University showed that quantum computers could, at least in principle, model any physical system.

But the real icebreaker came in 1994, when a theorist named Peter W. Shor, AT&T Labs-Research, Florham Park, N.J., developed a provocative algorithm that outlined exactly how a quantum computer could indeed factor a huge number exponentially faster than a classical computer.

Quantum computing research team at UC Berkeley: (from left) Whaley and graduate students David Bacon, Simon Mygren, Ken Brown, and Julia Kempe. [Photo by Peg Skorpinski]
Factoring big numbers might sound like a mundane reason to create a quantum computer, but it's not. Factoring is the basis of encryption systems that keep activities like banking transactions private. In these systems, the factors of large numbers are the keys to deciphering encrypted messages, and it would take a hacker's computer far too long to determine these factors. But if someone built a big enough quantum computer, national security would be in danger of crumbling. So, not surprisingly, the U.S. military--notably the Defense Advanced Research Projects Agency--immediately pricked up its ears and began funding quantum information research.

A desktop quantum computer is still decades away, but in just a few short years, quantum computing has grown from an ongoing discussion on paper into a thriving experimental effort that's already produced a few rudimentary quantum computers. Numerous top-flight research institutions now have intellectual hives devoted to quantum computing, including Oxford University; the University of Innsbruck, Austria; the Boulder, Colorado, labs of the National Institute of Standards & Technology (NIST); Los Alamos National Laboratory; Massachusetts Institute of Technology; California Institute of Technology; and Stanford University. Even Microsoft Research in Redmond, Wash., now counts a quantum computer scientist among its theorists.

Leading journals like Nature, Science, Physical Review, and Physical Review Letters are awash with papers dealing with all aspects of quantum computing and information. The number of such papers has more than tripled since 1998.

And it is not just a game for mathematicians and physicists anymore. They are now being joined by materials scientists and chemists. In fact, quantum computer researchers courted chemists last August in Washington, D.C., with a large symposium at the national meeting of the American Chemical Society. Coorganizers K. Birgitta Whaley, a chemistry professor at the University of California, Berkeley, and Isaac Chuang, a researcher at IBM Almaden Research Center in San Jose, Calif., say they need chemists' unique expertise in areas that are rapidly becoming important to the field: nuclear magnetic resonance, quantum dots, quantum dynamics, and coherent control. The effort was "sort of a community outreach," Whaley says.

The basic unit of computation is a bit, represented as either a 1 or a 0. In the solid-state world of classical computers, the bit's value corresponds to the presence or absence of current. Bits can be manipulated by what are known as logic gates, which transform bit values in designed ways. For example, a NOT gate changes a bit from 0 to 1, or vice versa. In fact, everything a computer does, from word processing to modeling the structure of the universe, can be boiled down to various combinations of these simple logic gates operating on bits.
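
To make the gate picture concrete, here is a minimal Python sketch (an illustration added here, not drawn from the researchers quoted): a NOT, AND, and XOR gate, each built from NAND alone, composed into a one-bit adder.

def NAND(a, b):
    # Universal classical gate: output is 0 only when both inputs are 1
    return 1 - (a & b)

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def XOR(a, b):
    # (a NAND b) AND (a OR b), with OR itself built from NANDs
    return AND(NAND(a, b), NAND(NOT(a), NOT(b)))

def half_adder(a, b):
    # One-bit addition: sum and carry, built entirely from the gates above
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))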

A quantum computer shares some of these traits. Many quantum-sized systems or particles have two possible configurations that could correspond to the 0 or 1 of a bit: the up or down spin of an electron or a nucleus, for example, or the polarization of a photon. But here, the quantum system departs from its classical cousin. A quantum bit, otherwise known as a qubit, can be set up to exist in both the 0 and 1 states at the same time. According to the rules of quantum mechanics, the qubit will stay in this superposition of values until it is measured, at which point it "collapses" into a definite value.
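
A small numerical sketch in Python (an added illustration, assuming nothing beyond standard quantum mechanics) shows the idea: a qubit is a two-component vector of complex amplitudes, and measuring it yields 0 or 1 with probabilities given by the squared amplitudes, after which the state collapses.

import numpy as np

rng = np.random.default_rng(0)

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition of 0 and 1
psi = (ket0 + ket1) / np.sqrt(2)

def measure(state):
    # Outcome drawn with probability |amplitude|^2; the state collapses to that basis state
    probs = np.abs(state) ** 2
    outcome = rng.choice(len(state), p=probs)
    collapsed = np.zeros_like(state)
    collapsed[outcome] = 1
    return outcome, collapsed

print([measure(psi)[0] for _ in range(10)])  # a random mix of 0s and 1s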

Another quantum trick is entanglement. If two particles are prepared, or set up, together in the right way, their wavefunctions become a combined system and can't be factored into separate contributions. The implication that follows may seem bizarre: If you measure some aspect of one particle--polarization, for example--then its superposition of values collapses into one definite value, as expected. But the companion particle then also instantaneously assumes the same value, no matter how far apart the particles might be. This so-called action at a distance perplexed even Albert Einstein, yet it has been shown experimentally to occur.
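
The following Python sketch (an added illustration) builds the simplest entangled state of two qubits, (00 + 11)/sqrt(2), and shows both that it cannot be factored into two single-qubit states and that only the correlated measurement outcomes ever occur.

import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# The Bell state (|00> + |11>)/sqrt(2)
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# A product state, reshaped into a 2x2 matrix of amplitudes, has rank 1;
# an entangled state has rank 2 and therefore cannot be factored.
print("Schmidt rank:", np.linalg.matrix_rank(bell.reshape(2, 2)))  # 2

# Probabilities over the outcomes 00, 01, 10, 11: only the correlated pair appears,
# so measuring one particle fixes what the other will show.
print(np.abs(bell) ** 2)  # [0.5, 0.0, 0.0, 0.5]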

If a whole array of particles is set up in such superposition states, the result is a quantum computer register: a separate series of 0s and 1s for each combination of superpositions. In a simple example, a system of two qubits has four possible configurations: (00), (01), (10), and (11). Subsequent logic operations on the qubits could then operate on all of the different states at once.

It becomes easy to see that if each state represents, say, names or phone numbers, you could search all entries at once and come up with a solution in far fewer steps than a classical computer could. And the more qubits it has, the more a quantum computer outpaces the classical computer. To get an idea of what a colossal amount of effort it could save, consider that a quantum computer with 100 qubits will have 2^100 (about 10^30) simultaneous states.
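
A back-of-envelope Python sketch (added here for scale) shows why such a register quickly outgrows anything classical: describing n qubits classically takes 2^n complex amplitudes.

for n in (2, 10, 50, 100):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9  # 16 bytes per complex amplitude
    print(f"{n:3d} qubits -> {amplitudes:.3e} amplitudes, ~{gigabytes:.3e} GB to store classically")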

But practical quantum computers come with considerable baggage. First, how do you get your answer? After all, if you measure the system, it collapses into a single value that's not necessarily the right answer. Researchers have found ways to get around the problem. For example, Lov K. Grover, a theorist at Lucent Technologies' Bell Labs in Murray Hill, N.J., has developed an algorithm that manipulates the system with a series of logic gates that increases the amplitude of the system's wavefunction at the value you want, thereby eventually driving the probability of getting that result when you measure the system toward 1, or certainty.
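
Here is a short numerical sketch of the amplitude-amplification idea behind Grover's approach (an added illustration, not Grover's own code): start in a uniform superposition over N items, then repeatedly flip the sign of the marked item's amplitude and reflect all amplitudes about their mean. After roughly (pi/4)*sqrt(N) rounds, nearly all of the probability sits on the marked item.

import numpy as np

N, marked = 64, 37
amp = np.full(N, 1 / np.sqrt(N))       # uniform superposition over N items

rounds = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(rounds):
    amp[marked] *= -1                  # "oracle": tag the sought value
    amp = 2 * amp.mean() - amp         # "diffusion": inversion about the mean

print(f"P(marked item) after {rounds} rounds: {amp[marked]**2:.3f}")  # ~0.997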

But even with a workable strategy for obtaining an answer, you still must contend with a more insidious problem known as decoherence, which constantly threatens to interfere with quantum computation. Because, unfortunately, it is not just a purposeful measurement that can collapse a delicate quantum system. Absolutely any sort of interaction with the environment, be it a stray photon or vibration, will do it.

Protecting a quantum system is extremely difficult. A whole subfield known as quantum error correction has sprung up to deal with the problem. And researchers have developed different strategies to protect quantum systems from decoherence and to repair damage that does occur.

About five years ago, Shor and theorist Andrew M. Steane at Oxford independently discovered that the approach to active error correction used in classical computers could also be used in quantum computers. A classical setup consists of several redundant bits for each piece of information, which the computer repeatedly checks. If one redundant bit's value differs from the others, the computer changes it back. A system of extra, or ancillary, qubits could do the same thing for a quantum computer. Various combinations of these qubits can be checked--without measuring their individual values--to see if they differ. Such a comparison won't collapse the system, because technically, nothing has been "observed." But one can infer whether an error has occurred.
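
The classical half of that idea is easy to sketch in Python (an added illustration of the redundancy scheme described above, not the quantum codes themselves): with three copies of a bit, two parity checks reveal which copy flipped without reference to the value being stored, and the error can be undone.

def encode(bit):
    # Three redundant copies of one bit
    return [bit, bit, bit]

def correct(word):
    # Parity checks: do neighboring copies agree? The pattern pinpoints the flipped bit.
    s1 = word[0] ^ word[1]
    s2 = word[1] ^ word[2]
    flipped = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get((s1, s2))
    if flipped is not None:
        word[flipped] ^= 1  # flip it back
    return word

word = encode(1)
word[2] ^= 1                # a stray "environmental" bit flip
print(correct(word))        # [1, 1, 1] -- error located and repaired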

Yet another approach, which does not rely on ancillary bits, is the theory of decoherence-free subspaces, which leads to an alternative passive form of error correction. What this means in plain English is that, in many cases, errors are well characterized and can be anticipated, so systems could be designed with that in mind. Research by Whaley; Chuang; Daniel Lidar, an assistant professor of theoretical chemical physics at the University of Toronto; and their colleagues shows that physical systems can sometimes be put into special quantum superposition states designed so that they simply do not interact with certain environmental contaminants. Such a situation arises in a solid-state system at low temperatures when qubits are placed closer together than the typical wavelength of phonon modes.

Four trapped beryllium ions, shown in an expanded view of NIST's trap. [NIST photo]
Recently, physicist Paul G. Kwiat at Los Alamos and colleagues obtained the first experimental evidence that physical systems can be created in such decoherence-free subspaces [Science, 290, 498 (2000)].

Another variant of error correction comes from Princeton University chemistry professor Warren S. Warren and former graduate student Jeff P. Barnes. In their approach, they use the natural dissipation of energy in a system to steer the system back to where it should be [Phys. Rev. Lett., 85, 856 (2000)]. As Barnes explains, if, for example, a photon interacts with a system and nudges it away from the state it should be in, the system responds by dissipating more energy, which in turn causes the system to evolve back toward its original value.

In choosing the sort of quantum system that would make a good computer, scientists face a bit of a catch-22 situation, Whaley says. A quantum system needs to last for a reasonably long time before it starts to become decoherent. In other words, it needs to interact weakly with its environment. But if that's the case, it generally also will interact weakly with itself, making entanglement generation and quantum computation difficult. "The golden egg everyone's looking for would be a system that has a weak interaction with the environment and a strong internal interaction," she says.

One of the first descriptions of a real, physical quantum computer came in 1995 from Innsbruck theoretical physicists J. Ignacio Cirac and Peter Zoller. They proposed that a quantum computer could be built out of trapped ions. Cleverly suspended in an electromagnetic corral, these ions would form a row, able to move only in the direction of the row. The ions would be supercooled to their ground states. The ions' ground and excited states would act as the 0 and 1 of a qubit. And the ions' mutual coulombic repulsions and vibrational interactions would mediate entanglement. Pulses from ultrafocused lasers--one for each atom--would be used to put the qubits in their superposition states and entangle the qubits.

A big advantage of this method is that the trap isolates the ions quite well from their environment, yet the particles still interact well with each other.

Physicists Christopher Monroe and David Wineland and their colleagues at NIST in Boulder put Zoller and Cirac's idea to the test, building a system based on a single beryllium ion. In this system, the ion held both qubits: one in its ground and excited internal states and one in its vibrational states. With the system, they demonstrated a quantum logic gate.

Recently Monroe, who is now an associate physics professor at the University of Michigan, Ann Arbor, and his NIST colleagues used their ion-trap technology to entangle four beryllium ions predictably and reproducibly--a method that should be applicable to any number of ions [Nature, 404, 256 (2000)]. This result is important, because with previous methods--such as preparing atoms in a thermal beam--producing entangled particles is a chancy operation, and success becomes increasingly unlikely as the number of particles increases.

Graduate student Holly Cummins (front) and Jones (back) in the NMR lab at Oxford University.
However, as quantum computing technology goes, an ion trap is an elaborate and tricky device. As Oxford physical chemist Jonathan A. Jones notes, "An awful lot of effort needs to be spent to keep the ions still before you can start manipulating them." Nevertheless, the technology holds enough promise that a number of groups, including those at Oxford and Los Alamos, are now studying trapped ions.

But by far the best-known method for quantum computing right now is based on nuclear magnetic resonance (NMR).

"The advantage of NMR over other technologies is quite simple: It works," Jones says. As he points out, NMR quantum computers are the only ones to have yet actually performed a quantum algorithm.

In 1996, MIT associate nuclear engineering professor David G. Cory and colleagues first proposed the notion that NMR could be used to manipulate and read qubits in the form of nuclear spins. At about the same time, IBM's Chuang and MIT computer science professor Neil Gershenfeld also stumbled onto the idea.

When nuclei with spin are placed in a magnetic field, they align themselves either parallel or antiparallel to the field, with a slight excess in the parallel orientation. A pulse of electromagnetic radiation of the right frequency and duration will flip a nucleus. In this way, nuclear spin serves as a qubit.

The qubits can be put into both parallel and antiparallel alignments at the same time by giving them a pulse of energy that's too short to cause nuclei to flip completely. A classical analogy is that of a spinning top that tips at a 90° angle to the magnetic field. The sideways nucleus then precesses around a vertical axis.
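
In Python, the effect of pulse length can be sketched as a rotation of the qubit's state vector (an idealized illustration, not the groups' actual pulse programs): a full 180° pulse flips the spin outright, while a 90° pulse leaves it in an equal superposition.

import numpy as np

def Rx(theta):
    # Rotation of a spin-1/2 state by angle theta about the x-axis
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

ket0 = np.array([1, 0], dtype=complex)   # spin parallel to the field

flipped = Rx(np.pi) @ ket0               # 180-degree pulse
tipped = Rx(np.pi / 2) @ ket0            # 90-degree pulse

print(np.abs(flipped) ** 2)              # [0. 1.]   -> fully flipped
print(np.abs(tipped) ** 2)               # [0.5 0.5] -> both alignments at once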

In one experimental setup of two qubits, for example, one nucleus aligned parallel to the magnetic field is put into a superpositioned state. But its precession will be affected by a second nearby nucleus. If the second nucleus' spin is up--that is, parallel to the magnetic field--it adds energy to the first nucleus and causes it to precess a bit faster. If the second nucleus' spin is down--or antiparallel to the magnetic field--it slows the precession of the first. This interaction affects the direction in which the precessing nucleus will be pointing at any given time. If another pulse of energy is given to the system at the right instant, the first nucleus will then flip up or down, depending on where its precessing spin is pointing. This two-qubit operation is one of the fundamental logic gates, known as a controlled-NOT gate, that a computer uses. The resulting NMR spectrum reveals which direction the nucleus flipped.
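
Written out as a matrix (a standard textbook form, added here; the NMR experiment realizes it with pulses and precession rather than matrix algebra), the controlled-NOT gate flips the target qubit only when the control qubit is 1:

import numpy as np

# Basis ordering: |control, target> = 00, 01, 10, 11
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

labels = ["00", "01", "10", "11"]
for idx, name in enumerate(labels):
    state = np.zeros(4, dtype=complex)
    state[idx] = 1
    out = CNOT @ state
    print(name, "->", labels[int(np.argmax(np.abs(out)))])
# 00 -> 00, 01 -> 01, 10 -> 11, 11 -> 10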

Chuang and physicist Costantino Yannoni performing an NMR quantum-computing experiment at IBM Almaden. [IBM-Almaden photo]
What's unique about NMR is that each molecule is itself a quantum computer. So a small vial of solution contains trillions of computers. A number of them can be lost to decoherence without immediately affecting the whole vial. And NMR quantum computer decoherence times are very long to begin with--up to seconds--which allows ample time to perform calculations before the system collapses completely.

The first quantum computer of MIT's Cory and colleagues consisted of two hydrogen qubits in the molecule 2,3-dibromothiophene, on which they demonstrated different logic gates. Since then, the NMR results have been coming in thick and fast. In 1998, Chuang and colleagues also created a two-qubit quantum computer out of chloroform, this time using a carbon atom and a hydrogen atom as the qubits. They used it to perform two different algorithms. Jones and his colleagues at Oxford ran the same algorithms using two hydrogen qubits in cytosine.

Also in 1998, Cory, Los Alamos physicist Raymond Laflamme, and their colleagues used three qubits in trichloroethylene to run part of the error-correction algorithm of Shor and Steane.

Just this year, biological chemistry and molecular pharmacology professor Amr F. Fahmy at Harvard Medical School in Boston, organic chemistry and biochemistry professor Steffen J. Glaser at the University of Munich, and their colleagues built a five-qubit quantum computer out of a glycine fluoride derivative, which they used to run an algorithm that distinguishes one class of mathematical functions from another [Phys. Rev. A, 62, 012310 (2000)]. And Chuang and colleagues demonstrated a relatively complex, "order-finding" search algorithm with a five-qubit quantum computer based on an iron complex (http://xxx.lanl.gov/abs/quant-ph/0007017).

At Los Alamos, Laflamme, computer scientist Emanuel Knill, and colleagues created a seven-qubit system using trans-crotonic acid [Nature, 404, 368 (2000)].

However, most in the field agree that NMR's future in quantum computing is limited. It is enjoying the spotlight now, but the technology suffers from several potentially fatal limitations. One of the most serious is lack of what is known as scalability. As the number of nuclear spins in a molecule increases, the number of molecules in the correct starting state decreases exponentially. So by the time a system gets larger than about 10 qubits, any signals are so weak that ordinary noise drowns them out.
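
As a rough illustration (an assumed, commonly quoted scaling, not a figure from the article), the usable signal from the standard liquid-state NMR preparation falls off roughly as n/2^n with the number of spin qubits n:

for n in (2, 5, 7, 10, 20):
    rel_signal = n / 2 ** n
    print(f"{n:3d} qubits -> relative signal ~ {rel_signal:.1e}")
# the signal drops by orders of magnitude well before 20 qubits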

Nevertheless, NMR remains an extremely valuable system. "NMR is very important for benchmark tests on small numbers of quantum bits," Whaley says.

So what is the quantum computing system of the future? "We have no idea what the ultimate physical implementation of a useful quantum computer will be, but there are hints," Chuang says. "It will probably use some solid-state technology, and it will look much like NMR, with complicated pulse sequences and spin systems. But the pulses might be at optical frequencies, and the spins might be dopants in a crystal, instead of liquids."

Quantum computing research team at the University of Maryland's Laboratory for Physical Sciences: (from left) graduate student Matthew LaHaye, Kane, graduate student Michael King, physicist Marc Manheimer, graduate student Dan Sullivan, physicist Keith Schwab, and graduate student Kenton Brown.
Because of that promise of a solid-state technology, Whaley and Chuang brought to the ACS symposium a number of leading materials scientists, such as researchers from the lab of UC Santa Barbara physics professor David Awschalom; UC Berkeley chemistry professor A. Paul Alivisatos; and UC Los Angeles chemistry professor James R. Heath.

Their work could benefit the ideas of researchers like David DiVincenzo, a physicist at IBM Watson, who proposes creating a quantum computer from an array of quantum dots. Quantum dots are small, isolated collections of atoms that behave much like a single atom, sharing electrons and a common wavefunction. A quantum dot on the scale of about 40 nm, and with a single excess electron, could serve as a qubit, with the electron's up and down spin serving as the 0 and 1. Such a technological feat is still in the future, DiVincenzo says. In particular, a key and elusive goal is to measure an individual spin state, something some labs are beginning to experiment with.

Bruce E. Kane, physics professor at the University of Maryland, envisions a method that takes advantage of both nuclear and electronic spins. In his scenario, single 31P atoms embedded in silicon in a precisely spaced array serve as qubits. The nuclear spin of the phosphorus acts as the original qubit, but the information is transferred to the phosphorus' unpaired electrons. "An electron spin is a very attractive qubit," Kane says. "It's a lot easier to manipulate than a nucleus."

Silicon is an appealing material in several ways. Both its electron and nuclear spins have extremely long decoherence times. It is already the seminal material of computer processors. Research momentum is thus already directed toward silicon. One technological challenge, however, will be ensuring the purity of the silicon, because even a single unwanted atom could cripple the system. "Fluctuation or variation is a killer of quantum computing," Kane says.

Silicon-based quantum computers are the focus of a mammoth research effort by a consortium including several universities in Australia and the U.S., as well as Los Alamos, leading those in the field to dub it the "Manhattan Project of quantum computing."

Once powerful enough quantum computers do become practical, what could they be used for, besides factoring? If one remembers Feynman's prophecy that a quantum system would be needed to model quantum systems, it becomes clear that the host of scientific problems involving quantum dynamics might benefit from such technology, Whaley says. For example, MIT assistant mechanical engineering professor Seth Lloyd and colleagues recently developed a quantum algorithm that performs a key component of electronic structure calculations and other computational chemistry problems. "There's lots of very interesting physics and chemistry currently beyond our reach with conventional computation power that can be done with quantum algorithms," Whaley adds.

Chemical & Engineering News
Copyright © 2000 American Chemical Society