If, as Nobel Prize–winning physicist Richard Feynman once said, no one really understands quantum mechanics, then you can appreciate the dilemma that faced Geordie Rose earlier this year as he stood at a podium in front of a room packed with journalists, skeptics, and potential investors, deep in the heart of Silicon Valley. As the chief technology officer and co-founder of D-Wave Systems, a Burnaby-based tech start-up that spun out of the University of British Columbia in 1999, Rose had the daunting task of explaining his company’s breakthrough, billed as “the world’s first commercial quantum computer.” With dark eyebrows looming over a bulldog face, and his powerful athlete’s body dwarfed by a pair of giant screens flanking the podium, he faintly evoked Richard Nixon wilting under the bright lights of the Kennedy debates.
Building a quantum computer—a computer, that is, that harnesses the extraordinarily strange laws of quantum mechanics, which come into play at a subatomic level—has been one of the foremost goals of the scientific world for more than a decade. It would be a fundamentally transformative machine, capable of modelling and predicting the behaviour of almost anything in the universe. Most scientists believe it won’t be possible to build one for many decades, if ever. Rose begged to differ.
On the twin screens behind him, he called up a giant Sudoku puzzle—“a whimsical example,” he acknowledged with a smile. The audience watched as the blanks in the Sudoku were filled in on the screen via a remote connection to the prototype quantum computer, which was back on Burnaby Mountain, housed in a protective copper-walled box at −273°C, a hundredth of a degree above absolute zero. While solving a Sudoku is no great accomplishment (for a computer), Rose explained that the puzzle represents a class of mathematical problems that, on larger scales, classical computers are ill-equipped to handle, problems that crop up frequently in business contexts like route planning and database searching.
Pitched to venture capitalists and potential corporate customers, the demo was long on vision and short on quantum mechanics, and the scientific community was underwhelmed; by choosing not to submit their results to peer review before unveiling the computer, D-Wave was skipping the crucial step by which science legitimizes new discoveries. More surprising was the cursory coverage the announcement received in the press. Even without peer-reviewed results, “you would have expected there to be at least some initial spike of excitement,” says University of Waterloo researcher Jan Kycia. The media, it appears, simply didn’t realize how enormously significant a development a working quantum computer would be, especially one invented by a seventy-five-person Canadian start-up not funded by the US government.
The motto of the United States’ National Security Agency (nsa), etched in a plaque across the road from the spy agency’s sprawling headquarters off Route 295 in Maryland, is “Always out front.” In the world of modern military intelligence, that primarily means staying ahead of rivals in the race for innovative technology to help monitor, decrypt, and analyze surveillance data. This need is the engine that has driven quantum-computing research since 1994, when Peter Shor, then a computer scientist at at&t, made an unexpected discovery. “Shor’s algorithm” proved that, in the unlikely event that a quantum computer could be built, it would be able to calculate the factors of very large numbers in a short time. While it’s easy to determine that the factors of twenty-one are three and seven, finding the factors of a number with 300 digits would take several millennia for any supercomputer yet built. Since large, impossible-to-factor numbers are used to encrypt everything from secure Internet-banking transactions to top-secret government communications, Shor’s algorithm had immediate implications for the nsa. Not only is it interested in “reading Osama’s email,” as some researchers put it, but it has to ensure that encrypted US government communications can’t be decoded down the road by yet-to-be-invented technology.
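The asymmetry Shor’s algorithm attacks can be seen in a toy sketch (a hypothetical illustration, not code from D-Wave or the nsa): trial division finds the factors of a small number like twenty-one instantly, but the number of candidates it must check grows exponentially with the number of digits, which is why a 300-digit number defeats any classical supercomputer.

```python
def trial_division(n):
    """Return the smallest prime factor of n by testing divisors up to sqrt(n).

    The loop runs roughly sqrt(n) times, and sqrt(n) doubles with every
    two extra digits of n -- exponential growth in the input size.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime


p = trial_division(21)
print(p, 21 // p)  # prints "3 7" -- the factors of twenty-one
```

A quantum computer running Shor’s algorithm would, in effect, sidestep this divisor-by-divisor search entirely.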
The nsa began pouring money into quantum computing, which up to then had been an obscure idea viewed mainly as a thought experiment, and it was soon joined by other agencies in the Department of Defense. By spreading funding to groups outside the United States, nsa program administrators also kept their fingers on the pulse of progress in virtually every significant research effort in the field. This year, US government spending on quantum-computing research reached $60 million, according to an nsa estimate—a hefty sum for a program whose most concrete progress after more than a decade remains a 2001 experiment that calculated that the factors of fifteen are three and five.
Viewed in this light, D-Wave’s Sudoku demonstration looks a little more impressive, especially since Rose says the company has never applied for or received money from the nsa or its sister agencies. D-Wave bills itself as the world’s only dedicated commercial quantum-computing enterprise, having raised about $45 million from angel investors and venture capitalists.
Success for D-Wave could be seen as the nsa’s worst nightmare: a breakthrough by a privately held foreign company that doesn’t disclose its results through the usual scientific channels. It would also run counter to the prevailing wisdom that military funding of basic (knowledge-for-its-own-sake) research is the best means of producing technological innovation. Give money to quantum physicists or astronomers, the argument goes, and your quest for a greater understanding of the universe will produce by-products like lasers and moon landings.
An opposing school of thought, articulated by historian Paul Forman, argues that post-World War II military funding has both diverted scientists from the pursuit of knowledge and proven to be a mostly ineffective way of developing new technology. “Most technological advance is incremental,” says Forman, a curator at the Smithsonian’s National Museum of American History in Washington, “and incremental technological advance is not, by definition, coming from basic research.” D-Wave, in contrast to its nsa-funded counterparts in academia, is trying to build a computer, not make new discoveries about quantum mechanics, an approach that may give them an advantage. “It’s the computation part that is important,” Rose says, “not the quantum part.”
Classical computers encode their data as a series of binary bits, which can take on one of two different values, usually denoted by zero and one. Quantum computers, on the other hand, take advantage of the rules of quantum mechanics, which were developed to explain a series of puzzling experimental results in the early twentieth century. It turns out that the rules governing the motion of very small particles such as electrons are wildly different from the ones we’ve observed from, say, throwing baseballs around. An electron can be in two places at once, it can “tunnel” through walls, and it can teleport. As a result, a “qubit,” the quantum version of a bit, can be zero, one, or—and this is the crucial part—zero and one at the same time. This is where the remarkable power of quantum computers resides, and it increases exponentially if you combine qubits together: a group of just sixteen qubits can represent 65,536 different numbers simultaneously.
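The exponential claim is just counting: n two-level systems have 2ⁿ possible settings, and a register of n qubits can hold a superposition over all of them at once. The arithmetic behind the figures in this story can be checked in a couple of lines (a plain counting sketch; nothing about the code itself is quantum):

```python
# Number of distinct values representable by n two-level systems (bits or qubits).
for n in (16, 32, 1024):
    print(n, 2 ** n)
# 16 qubits  -> 65,536 simultaneous values (D-Wave's demo system)
# 32 qubits  -> over 4 billion (the system promised by year's end)
# 1,024 qubits -> a number with more than 300 digits
```

The difference is that a classical register holds one of those values at a time, while a quantum register can, in principle, work with all of them at once.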
Cracking a code is essentially like trying to guess a number between one and a trillion (the harder the code, the bigger the number). While a classical computer would have to try each number in succession until it found the right one, a quantum computer could try all the numbers at once. This popular explanation of how a quantum computer works, however, isn’t quite right. After all, if you ask a trillion questions at once (“Is zero through a trillion the right password?”), the answer you get back (“Yes”) isn’t very useful. Shor’s algorithm does indeed involve manipulating qubits that represent many numbers simultaneously, but it requires a more circuitous search for patterns that enable it to reject the wrong passwords and spit out only the right one.
However you think about it, the result is a computer that can do things that were previously impossible, such as factoring huge numbers quickly. But factoring is far from the only application of quantum computing, though it has dominated the agenda thanks to the nsa’s interest and bankroll. “I’ve always thought that Shor’s algorithm has been a very big negative overall for quantum computing, because it’s thrown everybody off track,” D-Wave’s Rose says. “The really valuable applications are not in code-breaking—those are not interesting from an industrial perspective, because they don’t lead to recurring large markets.”
The most radical application for quantum computers is something called quantum simulation. If you wanted to know whether a certain complicated molecule would make a good drug, you could, in theory, solve the quantum-mechanical equations that govern the motion of all the electrons and protons and other components of the molecule, which would tell you exactly how it will behave. But the math is too complex for us to puzzle out. Because qubits follow the same quantum mechanical rules that electrons and protons do, a quantum computer actually embodies those equations and can solve them effortlessly. If the computer is powerful enough, it could model just about anything in the universe. “This is not like going from the Pentium II to the Pentium III,” says Ray Laflamme, director of Waterloo’s Institute for Quantum Computing, the world’s largest dedicated quantum-computing research centre. “It’s fundamentally different.”
The difference is such that no one has really been sure how to build a quantum computer. The qubit has to be something that can physically take on two values representing zero and one, and there are wildly different ideas about what that should be. The computer that factored fifteen in 2001 was a seven-qubit system that used nuclear magnetic resonance (the same process behind mri scans) to toggle the spins of a test tube of carbon and fluorine atoms. This approach is great for toy-sized computers, but seemingly impossible to scale up to the thousands or millions of qubits needed for a useful quantum computer. The opposite is true for superconductor-based systems, in which zero and one are sometimes represented by electrical currents flowing either clockwise or counter-clockwise around a loop of superconducting metal. Building even a few superconducting qubits has proven to be exceptionally challenging, but if that hurdle is cleared, it should be relatively straightforward to scale the system up using production techniques already developed by the computer industry.
D-Wave’s demonstration system is a sixteen-qubit superconductor-based machine. This is remarkable because, of all the groups around the world working on superconducting qubits, none have succeeded in operating more than two qubits together. The problem is that the quantum state that allows zero and one to exist simultaneously is extremely sensitive to outside perturbations. Noise from the outside environment, stray electrical signals, and tiny temperature fluctuations can all cause the qubit to lose its quantum-mechanical properties, a process known as decoherence. For scientists confronted with D-Wave’s claims that they’ve operated a sixteen-qubit computer, the immediate question is: how did they overcome decoherence? “If you go and ask anybody in the world who’s leading in putting qubits together, the first thing you’ll ask them is, ‘What is the decoherence time?’” says Laflamme. “And if they say ‘I don’t know,’ you’ll be very skeptical of the rest of the thing.”
D-Wave has certainly met with skepticism, but Rose, who won three Canadian wrestling titles before starting his Ph.D. in theoretical physics at ubc, seems to relish the fight. D-Wave, he explains, has adopted a modified quantum-computer architecture that, while not capable of executing Shor’s algorithm, has the advantage of being less sensitive to decoherence and can still tackle a number of other commercially interesting algorithms. He insists that the emphasis on the computer’s building blocks is a diversion from the real test of a quantum computer, which is how much time it needs to tackle problems of various sizes. “That’s really the only smoking gun for quantum computation,” he says. The current sixteen-qubit system is too small to test performance on large problems, but D-Wave’s ambitious road map calls for a thirty-two-qubit system by the end of this year, and 1,024 qubits by the third quarter of 2008. While scientists are clamouring for evidence that D-Wave’s computer is truly quantum, as far as Rose is concerned, the question will be answered purely on the basis of the computer’s performance within the next year. “And you know, it could turn out that the whole thing won’t work,” he says. “That’s always a risk. But it won’t be for lack of effort, I’ll tell you that much.”
When I finished my physics Ph.D. and headed out on the post-doctoral job-interview circuit in late 2001, interest in quantum computing was exploding. I added a couple of slides to the end of my standard presentation, drawing an extremely tenuous link between my thesis work and quantum computing, and even went so far as to insert “quantum computing” in the title of a talk I gave at a university in the Netherlands. I finally accepted a position in the quantum-computing group at an nsa lab in Maryland, where, despite the name, the research we did had little connection to the world of qubits. Instead, we were exploring the boundaries between the classical and quantum worlds. We know, for instance, that an atom can be in two places at once, and a baseball can’t. But what about ten atoms? Or 1,000 atoms? Where is the dividing line between atoms and baseballs? You could argue—as we did to our funders—that it would be useful to know this in order to build a quantum computer, but, really, our primary interest was trying to answer one of the great questions of modern physics.
The duplicitous game of justifying fundamental science by promising that it will lead to a magic computing machine or some equivalent is, increasingly, a fact of life for scientists in every field. Forman, the Smithsonian historian, argues that the transition from modernity to postmodernity three decades ago was marked by a dramatic flip in the status of science and technology. Until then, “science was very closely connected with knowledge for its own sake,” he says. Now, the process of scientific inquiry is only justified by its ends: useful gadgets. “Our postmodern perspective really does put the very existence of science into question.” Those of us who view understanding our universe as a worthy goal can be grudgingly grateful that the military’s approach to technology development still leaves room, however constrained, for basic science.
There is also room for D-Wave’s purely technological, engineering-style approach. Their glossy demonstration in February may have left some scientists fuming, but it spurred plenty of ideas for applications from potential industry partners, Rose says. As the company forges ahead, intent on doubling and redoubling the number of qubits in its system rather than painstakingly ensuring that the qubits it already has are really behaving as they should, scientists will continue to be irritated. But what D-Wave is trying to do shouldn’t be confused with science: it is simply trying to build a computer, using technology that may or may not be ready for the task. In that spirit, it’s worth remembering Rose’s words as he stood at the podium demonstrating a final sample application: the famously difficult mathematical problem of finding the optimal seating plan for a wedding. The list of guests and their seating requirements was sent from the California auditorium to the system in British Columbia, which churned out the answer. As the giant screen displayed a solution that separated two guests who wanted to sit together, Rose turned to face the audience, eyes blinking innocently, and said, “Sometimes you just can’t satisfy everybody.”
Alex Hutchinson is a fitness and travel writer, and a frequent Walrus contributor. He writes the Globe and Mail’s Jockology column.