
Quantum computing's future is almost here—are we ready for it?

As we approach useful hardware, human elements of computing are becoming critical.

The future of computing is... a big metal tank? If it turns out there's just a guy with a laptop in there doing Google searches, I'm going to be very disappointed.
John Timmer

YORKTOWN HEIGHTS, NY—I'm in a room with one possible future for computing. The computer itself is completely unimposing, looking like a metal tank suspended from the ceiling. What makes an impression is the noise, a regular metallic ping that dominates the room. It's the sound of a cooling system designed to take hardware to the edge of absolute zero. And the hardware being cooled isn't a standard chip; it's IBM's take on quantum computing.

In 2016, IBM made a lot of noise when it invited the public to try out an early iteration of its quantum computer, which hosted only five qubits. That was far too few for any serious calculation, but it was more than enough for people to gain some real-world experience with programming on the new technology. Amid rapid progress, IBM installed more tanks in its quantum computing room and added new processors as they were ready. As the company scaled up to 20 qubits, it optimistically announced that 50-qubit hardware was on its way.
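
That public interface eventually grew into IBM's open-source Qiskit SDK. As a minimal sketch of what a five-qubit program looks like today (this assumes a current Qiskit install with the qiskit-aer simulator, not the original 2016 web interface):

```python
# Hypothetical "hello world" for a five-qubit machine: put every qubit into
# superposition and sample the results. This runs on a local simulator; the
# same circuit could be submitted to real hardware.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(5, 5)
for q in range(5):
    qc.h(q)                      # equal superposition on each qubit
qc.measure(range(5), range(5))

counts = AerSimulator().run(qc, shots=100).result().get_counts()
print(counts)  # ~100 samples spread across the 32 possible bitstrings
```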

During our recent visit to IBM's Thomas J. Watson Research Center, the company's researchers were far more circumspect, making clear that they weren't promising anything and that 50-qubit hardware is just a stepping stone toward quantum computing's future. But they did make the case that IBM was well-positioned to be part of that future, in part because of the ecosystem the company is building up around these early efforts.

Building blocks to chips

For its qubits, IBM uses superconducting wires linked to a resonator, all built on top of a silicon wafer. The wire and wafer let the company leverage its experience building circuitry, but in this case the wire is a mix of niobium and aluminum, which allows it to superconduct at extremely low temperatures. Jerry Chow, who showed us around the hardware testing room, said the company is still experimenting with the details of how to improve its qubits, testing different formulations and geometries individually or in pairs.

The resonator is sensitive to microwave frequencies, allowing each individual qubit to be set or read out using a microwave pulse. Each chip contains elements that take external microwave input and route it to individual qubits. There's nothing special about the microwaves themselves, so the input is created using off-the-shelf hardware. The only challenge is getting that input to the chip, deep in its liquid-helium-cooled tank. The hardware to do so has to not only withstand the extremely cold temperatures but also survive being warmed back to room temperature. (Once it's cooled, though, the hardware can operate indefinitely without replacement.)
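
To give a sense of what "setting a qubit with a microwave pulse" means, here's a toy two-level model (an illustration of the general physics, not IBM's actual control stack): a resonant drive applied for a time t rotates the qubit's state by an angle proportional to t, so the pulse length selects the state you prepare.

```python
import numpy as np

# Toy model: a resonant microwave drive applied for time t rotates the qubit
# by theta = Omega * t, where Omega is the (assumed) Rabi rate. A "pi pulse"
# (theta = pi) flips |0> to |1>; half that duration gives an equal
# superposition.
def drive(state, rabi_rate, duration):
    theta = rabi_rate * duration
    rx = np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                   [-1j * np.sin(theta / 2), np.cos(theta / 2)]])
    return rx @ state

ket0 = np.array([1, 0], dtype=complex)   # qubit initialized to |0>
omega = 2 * np.pi * 50e6                 # assumed 50 MHz Rabi rate
t_pi = np.pi / omega                     # pi-pulse duration: 10 ns

print(np.abs(drive(ket0, omega, t_pi)) ** 2)      # [0. 1.]   -> flipped to |1>
print(np.abs(drive(ket0, omega, t_pi / 2)) ** 2)  # [0.5 0.5] -> superposition
```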

Quantum computing relies on entanglement among these qubits. Chow told Ars that, to entangle any two of them, you can exploit the fact that they have slightly different resonant frequencies: if you address each member of a qubit pair using the resonant frequency of its partner, it's possible to entangle them. Collections of pairs can then be entangled into higher-order entangled systems. The qubits remain coherent for 100 microseconds at a time, while the entanglement of a qubit pair can be done in about 10 nanoseconds. Chow said entangling a full chip currently takes a few microseconds, which leaves sufficient time within the coherence window to prepare the whole system and perform calculations.
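
At the circuit level, the pattern Chow describes (entangle a pair, then chain pairs into a larger entangled system) looks like the sketch below. This is a generic Qiskit rendering of the idea, with the microwave-level mechanics hidden behind the two-qubit gate.

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# Chain pairwise entanglement into one larger entangled system: each cx gate
# entangles one pair, and the chain spreads entanglement across the chip.
n = 5
qc = QuantumCircuit(n, n)
qc.h(0)                   # superposition on the first qubit
for i in range(n - 1):
    qc.cx(i, i + 1)       # entangle qubit i with qubit i+1
qc.measure(range(n), range(n))

counts = AerSimulator().run(qc, shots=1024).result().get_counts()
print(counts)  # ideally only '00000' and '11111', each about half the shots
```

In an ideal run, the five qubits behave as a single correlated unit: every measurement returns either all zeroes or all ones.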

If it's all this straightforward, why haven't we already seen the 50-qubit chip?

The problem is that the qubits are extremely sensitive to environmental noise. This can be noise from outside the device (although the metal tank helps shield the chip from a lot of that). But it can also come from inside—the cooling system, microwave cabling, and the chip components themselves can all interact with the qubits. And any sort of interaction is disastrous for calculations.
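
A back-of-envelope model shows why even weak interactions are so damaging. The sketch below uses a generic depolarizing channel (an assumption for illustration, not IBM's measured error rates): mixing an entangled pair's state with a small amount of noise steadily drags its fidelity toward that of pure randomness.

```python
import numpy as np

# Depolarizing-noise illustration: blend a two-qubit Bell state with the
# maximally mixed state. Fidelity 1.0 is a perfect entangled pair; 0.25 is
# indistinguishable from random noise.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(bell, bell.conj())

for p in (0.0, 0.05, 0.2, 1.0):
    noisy = (1 - p) * rho + p * np.eye(4) / 4
    fidelity = np.real(bell.conj() @ noisy @ bell)
    print(f"noise strength {p:.2f} -> Bell-state fidelity {fidelity:.3f}")
```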

That means changing anything about the chip's architecture, even adding a single qubit, has the potential to change the frequency and type of errors when the chip is performing calculations. IBM does extensive modeling to try to limit this problem before a chip is made, but, to a certain extent, it's an empirical and iterative process: use a chip and see what happens. "Building more qubits will help us identify sources of noise and crosstalk," said Chow.

That was echoed by Sarah Sheldon, one of the scientists working on the microwave systems that control and read the qubits. "We have good tools for characterizing individual components but don't have efficient means of characterizing whole devices," Sheldon said. "As a system gets bigger, we face situations where control of one qubit may cause errors elsewhere." Later, she added, "We're approaching the limit where you can't simulate these devices classically—how do you tell it's operating properly?"
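
Sheldon's last point has concrete arithmetic behind it: simulating n qubits exactly means storing 2^n complex amplitudes, so memory demands double with every added qubit. Assuming 16 bytes per amplitude (a standard double-precision complex number):

```python
# Memory for a full n-qubit state vector: 2**n amplitudes, 16 bytes each
# (one double-precision complex number per amplitude).
for n in (20, 30, 40, 50):
    gib = 2 ** n * 16 / 2 ** 30
    print(f"{n} qubits: {gib:,.3f} GiB")
# 20 qubits fit in ~16 MiB; 30 need 16 GiB; 40 need 16 TiB; 50 need 16 PiB,
# far beyond the memory of any existing machine.
```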
