
Is Quantum Computing Closer to Reality?

Quantum computing has been discussed for a long time, but now seems to be getting closer to reality, with some big advances.

November 18, 2015

Quantum computing—the idea of working with computers that show quantum properties, such as being able to hold multiple states at the same time—has been discussed for a long time, but now seems to be getting closer to reality, with some big advances. At last week's Techonomy conference, I had the opportunity to host a panel on the subject with leaders of some of the companies that are pushing the envelope on this topic, including D-Wave and IBM.

Bryan Jacobs, a consultant at Berberian & Company, which offers advice on quantum computing, explained that in all the electronics we use today, information is stored through the charge of an electron, which is either on or off; in other words, a bit. But if you encode the information in a quantum state, such as a single electron or a photon, you can map that onto a zero or a one, just like a regular classical bit, but also onto a superposition, where it is zero and one simultaneously. The interesting notion, he explained, is that if you have a quantum computer with a large number of these quantum bits, often called qubits, you can start it in a superposition of all possible inputs at the same time, and then, if you can process the information in a quantum-coherent way, in some sense calculate the same function on all possible inputs simultaneously. This is known as quantum parallelism. He noted that there are two main approaches people are trying today: one is gate-based, which works more like a traditional digital computer, and the other is akin to an analog process, known as quantum annealing.
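
To make Jacobs' description concrete, here is a minimal sketch using numpy (not anything the panelists showed) of how a qubit's amplitudes differ from a classical bit, and why an n-qubit register spans 2 to the n inputs at once:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a length-2 complex vector of amplitudes.
zero = np.array([1, 0], dtype=complex)   # |0>
one = np.array([0, 1], dtype=complex)    # |1>

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposition = H @ zero                 # amplitudes (1/sqrt(2), 1/sqrt(2))

# An n-qubit register is the tensor product of single-qubit states, so its
# state vector has 2**n amplitudes -- the "all possible inputs at once" picture.
n = 3
state = superposition
for _ in range(n - 1):
    state = np.kron(state, superposition)
print(len(state))            # 2**3 = 8 amplitudes, one per classical input
print(np.abs(state) ** 2)    # each of the 8 inputs carries probability 1/8
```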

Vern Brownell, CEO of D-Wave Systems, which has actually delivered a few machines that use quantum annealing, said his company chose to use that approach first "because we thought that that was going to give us capability faster than any other type of quantum computing implementation." He said D-Wave looked at other models of quantum computing as well, but this approach was the most pragmatic.

He explained that he effectively has a quantum annealer with a thousand qubits, which can explore an answer space of two to the power of the number of qubits. Essentially, it works on complex optimization problems, looking for the lowest energy, or the best answer, for a given problem. Brownell noted that Google has now upgraded a previously purchased machine for its quantum artificial intelligence laboratory, where it is examining how this can assist in machine learning. Another customer is Lockheed, which is looking at a problem called software verification and validation.
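
As a rough classical stand-in for the kind of problem Brownell describes, the sketch below brute-forces the lowest-energy spin assignment of a tiny Ising-style optimization problem; an annealer searches the same exponentially large landscape physically rather than by enumeration. The coefficients here are invented purely for illustration:

```python
import itertools
import numpy as np

# Toy optimization: find the spin assignment (+1/-1 per variable) minimizing
# E(s) = sum_i h_i*s_i + sum_{i<j} J_ij*s_i*s_j. A quantum annealer settles
# into this minimum physically; here we enumerate all 2**n configurations,
# which is only feasible for very small n.
rng = np.random.default_rng(0)
n = 4
h = rng.normal(size=n)                       # per-variable biases (made up)
J = np.triu(rng.normal(size=(n, n)), k=1)    # pairwise couplings (made up)

def energy(spins):
    s = np.array(spins)
    return h @ s + s @ J @ s

best = min(itertools.product([-1, 1], repeat=n), key=energy)
print(best, energy(best))    # lowest-energy configuration and its energy
```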

Brownell acknowledged that neither of these examples has really gone into production yet, but said they have run real applications that are solving real problems at scale. In other words, they haven't yet hit the point where the D-Wave machine is outperforming classical supercomputers, but he said "we're very close to that." In the next few months, the company will be showing "that a quantum computer can outperform the best of what classical computing can do. We're at that hinge point right now."


Mark Ritter, distinguished research staff member and senior manager in the physical sciences department at the IBM TJ Watson Research Center, explained that his team is doing a number of different quantum projects, but has focused its work on gate-based quantum computing and error correction.

One of the theorists on his team, Sergey Bravyi, invented "a topological parity code." Ritter explained that we use error correction codes in traditional computers as well, but quantum information is very fragile, so to make a gate-based system you need a code to protect it. His team built a four-qubit system using qubits called "transmons," which can retain some of the quantum information for a longer period; combined with the error correction code, they can form the basis of gate-based quantum computing. He said the layout is like a square lattice, with the qubits at the vertices of graph paper, and an algorithm superimposes the code over the qubits. IBM's goal is to add more and more qubits to that scheme, and he said it may soon be able to preserve the quantum state indefinitely.
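
The code Ritter describes is a topological parity code on a lattice of qubits, which is well beyond a short example. As a much simpler stand-in, the sketch below shows the same parity-check idea on a classical three-bit repetition code: you measure parities (syndromes) rather than the data itself, and use them to locate and undo an error.

```python
# Simplified illustration of the parity-check idea behind error correction.
# This is a classical 3-bit repetition code, NOT IBM's topological parity code,
# but the principle carries over: measure parities, not the protected data,
# and infer from them which error occurred.
def encode(bit):
    return [bit, bit, bit]

def syndromes(codeword):
    # Parity of bits (0,1) and bits (1,2) is enough to locate any single flip.
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword):
    s = syndromes(codeword)
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(s)   # which bit to flip, if any
    if flip is not None:
        codeword[flip] ^= 1
    return codeword

word = encode(1)
word[2] ^= 1              # introduce a single bit-flip error
print(correct(word))      # recovers [1, 1, 1]
```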

He noted that quantum gates use entanglement across all the qubits and look at all the potential states, comparing this to the interference pattern you see when you drop a number of stones in a pond and get constructive and destructive interference. The best answer interferes constructively, he said, and if the problem has a single answer, that is the only answer you end up with. In a gate-based quantum computer, he said, you can exploit this interference in the encoding to get an answer at the end of the process, and for certain algorithms that should yield an exponential speedup.
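
The smallest possible illustration of the interference Ritter describes (again just a numpy sketch, not the panelists' own example) is applying a Hadamard gate twice: the two "paths" to the 1 outcome cancel destructively, the paths to 0 reinforce, and a definite answer is left at the end.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1.0, 0.0])

# H spreads |0> over both outcomes; a second H makes the amplitudes for |1>
# cancel (destructive interference) while those for |0> add constructively,
# so measurement returns 0 with certainty.
after_one = H @ zero
after_two = H @ after_one
print(after_one)   # [0.707, 0.707] -- equal superposition
print(after_two)   # ~[1.0, 0.0]    -- interference leaves a single answer
```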

While this may still be a way off, Ritter said people are also thinking about using the qubits to run analog simulations with high coherence, such as simulating various molecules. Jacobs agreed about quantum simulation, and talked about chemical simulations of stable molecules to find drugs.

I asked about Shor's algorithm, which suggests that with a quantum computer you could break much of conventional cryptography. Jacobs used the analogy of a rocket ship trying to send astronauts to the moon: the algorithm that executes the problem we're trying to solve, such as Shor's algorithm, is like the command module, and the error correction, such as what Ritter's team is working on, is like the stages of the rocket. But, he said, the fuels and rocket motors we have right now are not sufficient for a rocket ship of any size. It is a very tricky question, he said, and all of the overhead associated with doing the quantum computations and the error correction means that many of the algorithms that look really promising today may not pan out. Brownell said he thought we have a decade or more before quantum computers could break RSA encryption, and that we will have to move to post-quantum cryptography.
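
For readers curious why Shor's algorithm threatens RSA: factoring reduces to finding the period of a^x mod N, and that period-finding step is the part a quantum computer speeds up exponentially. The sketch below does the period search by classical brute force for the textbook case N = 15; it is a skeleton of the classical wrapper, not the quantum algorithm itself.

```python
from math import gcd

# Classical skeleton of Shor's factoring algorithm. The expensive step,
# finding the period r of f(x) = a**x mod N, is brute-forced here; that is
# exactly the step a quantum computer would perform efficiently.
def find_period(a, N):
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    r = find_period(a, N)
    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)
    return None   # unlucky choice of a; pick another and retry

print(shor_classical(15, 7))   # (3, 5), the factors of 15
```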

Brownell emphasized that the gate model of quantum computing is very different from quantum annealing, and talked about how useful annealing is for solving certain optimization problems today. He also said it is close to solving problems that are beyond the reach of classical computers. On some benchmarks, he noted, Google has found that the D-Wave machine could solve problems somewhere on the order of 30x to 100,000x faster than a general-purpose algorithm can today. While this was not a useful algorithm, he said his team is focusing on actual use-case algorithms that can take advantage of this capability as its processor scales in performance every 12 to 18 months.

Brownell compared quantum computing today to Intel in 1974 when it came out with the first microprocessor. He was with Digital Equipment Corp. at that point, and said that at the time "we weren't particularly worried about Intel, because they had these cheap little microprocessors that were nowhere near as powerful as these big boxes and stuff that we had. But within a matter of ten years, you know, [our] business was completely gone and Digital went out of business." He said that while he didn't think quantum computing would threaten the entire classical computing world, he does expect to see these incremental improvements in processors every 18 months, to a point where it's going to be a capability that will be necessary for IT managers and developers to use.

In particular, he said, D-Wave has co-developed probabilistic learning algorithms, some of them in the deep learning space, that can do a better job of recognizing things, and of training, than can be done without quantum computing. Eventually, he sees this as a cloud resource that will be used very much as a complement to classical computers.

Ritter said it was hard to really compare any of the quantum methods against classical machines doing general-purpose computing, because people are making accelerators and using GPUs and FPGAs designed for specific tasks. He said that even if you designed an ASIC specific to solving your problem, real quantum computing with real acceleration should beat any of them, because every qubit you add doubles the configuration space. In other words, putting a thousand qubits together gives a space of 2 to the 1000th power, which he noted is more than the number of atoms in the universe. The catch, he said, is that in a gate-based computer the gates operate more slowly than the processor in your cell phone, so you have more operations happening at once, but each operation is slower than on a classical computer. "That's why you have to make a bigger machine before you see this crossover," he said.

Jacobs pointed out how much more efficient quantum computing could be. "If you look at the power it takes using the best super green supercomputers in the world, if you wanted to do about a 65 qubit simulation, that would require about one nuclear power plant," he said, "and then if you wanted to do 66 it would require two nuclear power plants."
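
Jacobs' "one more power plant per qubit" point comes from the same doubling Ritter mentioned: a classical simulation of an n-qubit state needs 2 to the n complex amplitudes, so each additional qubit doubles the memory and, roughly, the energy. A back-of-the-envelope sketch, assuming 16 bytes per complex amplitude (an assumption of this example, not a figure from the panel):

```python
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    # A full n-qubit state vector holds 2**n complex amplitudes.
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (65, 66):
    exabytes = state_vector_bytes(n) / 1e18
    print(f"{n} qubits: ~{exabytes:.0f} exabytes of state")
# 65 qubits: ~590 exabytes; 66 qubits: ~1180 -- one extra qubit doubles the cost
```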

Brownell said that with more than 1,000 qubits, the current D-Wave machine can theoretically handle models of up to 2 to the 1000th power, equivalent to about 10 to the 300th. (For comparison, he said, scientists estimate there are only about 10 to the 80th atoms in the universe.) So, he says, the limits on the computer's performance are not due to limitations in quantum annealing, but rather to limitations in the I/O functions, an engineering issue that is being addressed in each new generation. On some of the benchmark algorithms, he claims, the company's 1,152-qubit machine should be 600 times more powerful than the best of what classical computers can do.
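
Brownell's figures are easy to check: 2 to the 1000th is roughly 10 to the 301st, comfortably above the commonly cited estimate of about 10 to the 80th atoms in the observable universe.

```python
from math import log10

# 2**1000 has about 1000 * log10(2) ~= 301 orders of magnitude, i.e. roughly
# 10**301 -- vastly more than the ~10**80 atoms estimated in the universe.
print(1000 * log10(2))         # ~301.03
print(1000 * log10(2) > 80)    # True
```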

D-Wave's architecture, which uses a matrix of coupled qubits that in some ways resembles a neural network, has seen initial applications in deep learning neural networks for machine learning.

But he also talked about other applications, such as running the equivalent of Monte Carlo simulations, which he used to do at Goldman Sachs (where he was CIO) for value-at-risk calculations. He remembered this taking about a million cores and having to run overnight. Theoretically, a quantum computer could do similar things with much less energy. He said the D-Wave machine itself uses very little power, though it needs to run inside a large refrigerator that maintains a very low temperature (about 8 millikelvin); even so, the whole machine takes only about 15 to 20 kW to run, which is quite small for a data center.
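
The Goldman Sachs workload Brownell recalls is a standard Monte Carlo value-at-risk calculation. A toy classical version is sketched below, with portfolio parameters invented for illustration, just to show the shape of the job that once needed a million cores running overnight:

```python
import numpy as np

# Toy Monte Carlo value-at-risk: simulate one-day portfolio returns and take
# the loss exceeded only 1% of the time. All parameters here are made up;
# real runs use far larger portfolios, richer models, and more scenarios.
rng = np.random.default_rng(42)
portfolio_value = 1_000_000.0
daily_mean, daily_vol = 0.0005, 0.02      # assumed return distribution

scenarios = rng.normal(daily_mean, daily_vol, size=1_000_000)
losses = -portfolio_value * scenarios
var_99 = np.percentile(losses, 99)
print(f"1-day 99% VaR: ${var_99:,.0f}")
```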

Ritter mentioned a similar idea for the gate-based model, and discussed quantum Metropolis sampling, which he said is the equivalent of quantum Monte Carlo but with different statistics because of the entanglement properties.

Ritter's team is also working on quantum analog simulation, in which it maps a molecular design onto a connected set of qubits and has the system solve for the modes and behaviors of the molecule, something he said becomes very hard classically once you get to around 50 electrons.

Jacobs discussed quantum cryptography, which involves a key that is generated in a way that can prove nobody was listening in on the transmission. Ritter said IBM's Charlie Bennett theorized a technique for "teleporting" the qubit on the link into another qubit in the machine, but said he thinks such techniques are more than a few years out.
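
What Jacobs describes is quantum key distribution; the best-known scheme, BB84, was co-invented by the same Charlie Bennett Ritter mentions (the panel did not name it). The sketch below simulates only the classical "sifting" step, in which sender and receiver keep the bits where their randomly chosen bases happened to match; in the real protocol, an eavesdropper measuring in the wrong basis would disturb those bits and be detected when a sample of them is compared.

```python
import random

# Classical simulation of BB84-style basis sifting (no real photons, no
# eavesdropper). Each bit is encoded and measured in a randomly chosen basis;
# only positions where the sender's and receiver's bases match are kept.
random.seed(1)
n = 16
bits = [random.randint(0, 1) for _ in range(n)]
send_basis = [random.choice("+x") for _ in range(n)]
recv_basis = [random.choice("+x") for _ in range(n)]

sifted = [b for b, s, r in zip(bits, send_basis, recv_basis) if s == r]
print("raw bits:  ", bits)
print("shared key:", sifted)
```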

Jacobs pointed out the differences between quantum gate computing and quantum annealing, particularly in the areas of error correction, and noted that there is another method as well called topological quantum computing that Microsoft is working on.

One interesting challenge is writing applications for such machines. Ritter described this as sending tones at specific frequencies that cause the different qubits to resonate and interact with each other in time, so that the computation occurs "almost like a musical score." He noted that there are higher-level languages, but that a lot of the work still requires a theorist. Jacobs noted that there are open source quantum languages at different levels, such as QASM and Quipper, both focused largely on the quantum gate model. Brownell noted there hasn't been as much activity around quantum annealing, because it was more controversial until recently; D-Wave has had to do a lot of that work itself and is working on moving its languages to higher levels. Within five years, he hopes, it will be as easy to use as a GPU or any other kind of classical resource.

About Michael J. Miller

Former Editor in Chief

Michael J. Miller is chief information officer at Ziff Brothers Investments, a private investment firm. From 1991 to 2005, Miller was editor-in-chief of PC Magazine, responsible for the editorial direction, quality, and presentation of the world's largest computer publication. No investment advice is offered in this column. All duties are disclaimed. Miller works separately for a private investment firm which may at any time invest in companies whose products are discussed, and no disclosure of securities transactions will be made.
