How D-Wave Built Quantum Computing Hardware for the Next Generation


By Jeremy Hsu
11 Jul 2014


One second is here and gone before most of us can think about it. But a delay of one second can seem like an eternity in a quantum computer capable of running calculations in millionths of a second. That’s why engineers at D-Wave Systems worked hard to eliminate the one-second computing delay that existed in the D-Wave One—the first-generation version of what the company describes as the world’s first commercial quantum computer.

Such lessons learned from operating the D-Wave One helped shape the hardware design of the D-Wave Two, a second-generation machine that has already been leased by customers such as Google, NASA, and Lockheed Martin. These machines have not yet been proven to definitively outperform classical computers in a way that would validate D-Wave’s particular approach to building quantum computers. But the hardware design philosophy behind D-Wave’s quantum computing architecture points to how researchers could build increasingly powerful quantum computers in the future.

“We have room for increasing the complexity of the D-Wave chip,” says Jeremy Hilton, vice president of processor development at D-Wave Systems. “If we can fix the number of control lines per processor regardless of size, we can call it truly scalable quantum computing technology.”

D-Wave recently explained the hardware design choices it made in going from D-Wave One to D-Wave Two in the June 2014 issue of the journal IEEE Transactions on Applied Superconductivity. Such details illustrate the engineering challenges that researchers still face in building a practical quantum computer capable of surpassing classical computers. (See IEEE Spectrum’s overview of the D-Wave machines’ performance from the December 2013 issue.)



Quantum computing holds the promise of speedily solving tough problems that ordinary computers would take practically forever to crack. Unlike classical computing that represents information as bits of either a 1 or 0, quantum computers take advantage of quantum bits (qubits) that can exist as both a 1 and 0 at the same time, enabling them to perform many simultaneous calculations.
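The idea of a qubit holding both values at once can be sketched numerically. Below is a minimal, illustrative state-vector model (a textbook abstraction, not a representation of D-Wave’s actual hardware): a qubit is a pair of complex amplitudes for the 0 and 1 basis states, and the squared magnitudes give the probabilities of measuring each value.

```python
import math

# A single qubit's state is a pair of amplitudes (alpha, beta) for the
# |0> and |1> basis states, normalized so |alpha|^2 + |beta|^2 = 1.
# An equal superposition weights both outcomes the same.
alpha = beta = 1 / math.sqrt(2)

prob_0 = abs(alpha) ** 2  # probability of measuring 0
prob_1 = abs(beta) ** 2   # probability of measuring 1

print(round(prob_0, 2), round(prob_1, 2))  # 0.5 0.5
```

Until it is measured, the qubit carries both amplitudes, which is what lets a register of qubits explore many configurations at once.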

Classical computer hardware has relied upon silicon transistors that can switch between “on” and “off” to represent the 1 or 0 in digital information. By comparison, D-Wave’s quantum computing hardware relies on metal loops of niobium that have tiny electrical currents running through them. A current running counterclockwise through the loop creates a tiny magnetic field pointing up, whereas a clockwise current leads to a magnetic field pointing down. Those two magnetic field states represent the equivalent of 1 or 0.

The niobium loops become superconductors when chilled to frigid temperatures of 20 millikelvin (-273 degrees C). At such low temperatures, the currents and magnetic fields can enter the strange quantum state known as “superposition” that allows them to represent both 1 and 0 states simultaneously. That allows D-Wave to use these “superconducting qubits” as the building blocks for making a quantum computing machine. Each loop also contains a number of Josephson junctions—two layers of superconductor separated by a thin insulating layer—that act as a framework of switches for routing magnetic pulses to the correct locations.

But a bunch of superconducting qubits and their connecting couplers—separate superconducting loops that allow qubits to exchange information—won’t do any computing all by themselves. D-Wave initially thought it would rely on analog control lines that could apply a magnetic field to the superconducting qubits and control their quantum states in that manner. However, the company realized early in development that it would need at least six or seven control lines per qubit for a programmable computer. The dream of eventually building more powerful machines with thousands of qubits would become an “impossible engineering challenge” with such design requirements, Hilton says.
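The wiring problem is simple arithmetic. The six-to-seven-lines-per-qubit figure comes from the article; the qubit counts below are just illustrative sizes showing how fast an all-analog wiring scheme would grow:

```python
# Each qubit would need its own set of analog control lines, so the
# wire count grows linearly with the number of qubits.
LINES_PER_QUBIT = 7  # upper end of the figure quoted in the article

for qubits in (128, 512, 2048):
    print(f"{qubits} qubits -> {qubits * LINES_PER_QUBIT} control lines")
```

At 128 qubits that is already nearly 900 lines running into a dilution refrigerator; at thousands of qubits the approach becomes unworkable, which is what motivated putting the control circuitry on the chip itself.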

The solution came in the form of digital-to-analog flux converters (DACs)—each about the size of a human red blood cell, at 10 micrometers in width—that act as control devices and sit directly on the quantum computer chip. Such devices can replace control lines by acting as a form of programmable magnetic memory that produces a static magnetic field to affect nearby qubits. D-Wave can reprogram the DACs digitally to change the “bias” of their magnetic fields, which in turn affects the quantum computing operations.

Most researchers have focused on building quantum computers using the traditional logic-gate model of computing. But D-Wave has focused on a more specialized approach known as “quantum annealing”—a method of tackling optimization problems. Solving an optimization problem means finding the lowest “valley,” which represents the best solution, in a problem “landscape” of peaks and valleys. In practical terms, D-Wave starts a group of qubits in their lowest energy state and then gradually turns on interactions between the qubits, which encodes a quantum algorithm. When the qubits settle into their new lowest-energy state, D-Wave can read out the qubits to get the results.
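The valley-finding idea has a well-known classical analogue, simulated annealing, which captures the “settle slowly into the lowest valley” intuition. The sketch below is purely classical—it does not model D-Wave’s quantum process—and the landscape function and parameters are invented for illustration:

```python
import math
import random

def energy(x):
    # A bumpy 1-D "landscape": a quadratic bowl with sinusoidal ripples,
    # giving several local valleys and one global minimum near x = 2.2.
    return (x - 2) ** 2 + math.sin(5 * x)

def anneal(steps=20000, seed=0):
    rng = random.Random(seed)
    x = rng.uniform(-4, 8)
    temperature = 2.0
    for _ in range(steps):
        candidate = x + rng.gauss(0, 0.1)
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; accept uphill moves with a
        # temperature-dependent probability, so the walker can escape
        # shallow local valleys while the "temperature" is still high.
        if delta < 0 or rng.random() < math.exp(-delta / temperature):
            x = candidate
        temperature *= 0.9995  # cool gradually so the walker settles
    return x

best = anneal()
print(round(energy(best), 2))
```

The gradual cooling schedule plays a role loosely analogous to D-Wave’s slow turning-on of qubit interactions: move too fast and the system gets trapped in a poor valley; move slowly and it ends near the true minimum.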

Both the D-Wave One (128 qubits) and D-Wave Two (512 qubits) processors have DACs. But the circuitry setup of the D-Wave One created some problems between the DAC programming phase and the quantum annealing phase. Specifically, the D-Wave One’s programming phase temporarily raised the temperature to as much as 500 millikelvin, which only dropped back down to the 20 millikelvin necessary for quantum annealing after one second. That’s a significant delay for a machine that can perform quantum annealing in just 20 microseconds (20 millionths of a second).

By simplifying the hardware architecture and adding some more control lines, D-Wave managed to largely eliminate the temperature rise. That in turn reduced the post-programming delay to about 10 milliseconds (10 thousandths of a second)—a “factor of 100 improvement achieved within one processor generation,” Hilton says. D-Wave also managed to reduce the physical size of the DAC “footprint” by about 50 percent in the D-Wave Two.
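The timing figures quoted above make the improvement concrete. A quick check of the arithmetic, using only the numbers given in the article:

```python
anneal_s = 20e-6      # 20 microseconds per quantum annealing run
delay_one = 1.0       # post-programming delay on D-Wave One, in seconds
delay_two = 10e-3     # post-programming delay on D-Wave Two, in seconds

print(round(delay_one / delay_two))  # 100: the factor-of-100 improvement
print(round(delay_one / anneal_s))   # 50000: delay vs. one anneal, D-Wave One
print(round(delay_two / anneal_s))   # 500: delay vs. one anneal, D-Wave Two
```

Even after the fix, the programming overhead still dwarfs a single 20-microsecond anneal, which is why such machines typically amortize the setup cost over many annealing runs of the same problem.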

Building ever-larger arrays of qubits continues to challenge D-Wave’s engineers. They must always be aware of how their hardware design—packed with many classical computing components—can affect the fragile quantum states and lead to errors or noise that overwhelms the quantum annealing operations.

“We were nervous about going down this path,” Hilton says. “This architecture requires the qubits and the quantum devices to be intermingled with all these big classical objects. The threat you worry about is noise and impact of all this stuff hanging around the qubits. Traditional experiments in quantum computing have qubits in almost perfect isolation. But if you want quantum computing to be scalable, it will have to be immersed in a sea of computing complexity.”

Still, D-Wave’s current hardware architecture, code-named “Chimera,” should be able to support quantum computing machines of up to 8,000 qubits, Hilton says. The company is also working on building a larger processor containing 1,000 qubits.
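D-Wave has publicly described the Chimera layout as a grid of eight-qubit unit cells, each a complete bipartite K4,4 graph, with matching qubits coupled between neighboring cells. Under that assumption (the cell structure is not detailed in this article), a short sketch can count qubits and couplers for a given grid size:

```python
# Count qubits and couplers in an m x n Chimera grid of K4,4 unit cells.
# Layout assumptions: each cell has 2*shore qubits with shore*shore
# internal couplers; adjacent cells share `shore` couplers per pair.
def chimera_counts(m, n, shore=4):
    qubits = m * n * 2 * shore
    internal = m * n * shore * shore                 # K4,4 edges inside cells
    external = shore * (m * (n - 1) + n * (m - 1))   # links between neighbors
    return qubits, internal + external

print(chimera_counts(8, 8))  # D-Wave Two scale: an 8x8 grid gives 512 qubits
```

The linear growth of couplers with cell count, and the fixed number of connections per qubit, is what makes the layout a candidate for scaling toward the thousands-of-qubits machines Hilton describes.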

The architecture isn’t necessarily going to stay the same, because we’re constantly learning about performance and other factors,” Hilton says. “But each time we implement a generation, we try to give it some legs so we know it’s extendable.”
