Feite Kraay, Author | 6 min read

Experienced racers know that the best strategy to achieve a win is not to start out too fast. Many novices, having rested and fueled in the days beforehand, will burst off the starting line full of energy and ambition only to fade out in the home stretch and struggle to a disappointing finish—if they finish at all. It’s generally better to run what’s called a ‘negative split,’ in which the second half of the race is run slightly faster than the first half. A negative split allows the athlete to conserve their energy for when it’s most needed—overtaking competitors in the second half and powering through to a strong finish. It takes an extraordinary amount of self-discipline and I don’t often manage it myself—though in my personal-best half marathon a couple of years ago, I ran the second half of the race six seconds faster than the first. The negative split is doable.

In my last post, I described how quantum computers work and the potential they have to solve complex, computationally intensive problems much faster than traditional or classical computers can. This potential is often referred to as ‘quantum supremacy’ or ‘quantum advantage.’ Precise definitions of these terms have long been a source of debate, but I like to think of quantum advantage as a problem that a quantum computer can solve significantly faster than a classical computer, and quantum supremacy as a problem that a quantum computer can solve but a classical computer simply can’t. Although various claims have been made for both, it’s safe to say that neither advantage nor supremacy has been realized yet in a commercially meaningful way.

If quantum computing is a race with advantage or supremacy as the finish line, then we have only just started our run. We should consider what it will mean to run a negative split and how we can finish strong without depleting our energy. We might even think of this as two races: one, among vendors, to deliver quantum computers that will actually provide advantage or supremacy; and two, among service providers and clients, to adopt quantum technology and realize its benefits while it continues to evolve.

Starting out noisy
The race to deliver commercially viable quantum computers seems more like an obstacle course with a very crowded field of competitors. At least ten different architectures for quantum hardware are being developed, each with advantages and disadvantages. In addition to the large hardware vendors and hyperscaler cloud service providers, there has also been a flurry of startup companies working on variations of these architectures for the last decade or more. Meanwhile, in the last couple of years the initial hype has started to subside and there are some signs of healthy consolidation in the industry. For the moment, large-scale, cost-effective quantum computers are still five to ten years in the future.

The current state of the industry is known as NISQ, or Noisy Intermediate-Scale Quantum, a term that aptly encapsulates the key obstacles still holding us back from achieving true quantum advantage or supremacy. Intermediate-scale refers to the fact that the industry has grown from early experimental computers of only a handful of qubits to commercially available systems with hundreds of qubits. And yet we remain far from the hundreds of thousands (or, some argue, even millions) of qubits necessary to reach the scale at which advantage or supremacy will be realized.

Obviously more qubits will drive more computing power, but these qubits also have to function coherently. This brings us back to the first element of NISQ—Noise, which means anything that can interfere with the coherent operation of a qubit, from warm temperatures to cosmic radiation and even the activity of neighbouring qubits in the same system.

Noise is probably the biggest obstacle in the race to achieve large-scale commercial quantum computers. Recall from my previous post that qubits are based on sub-atomic particles like electrons and photons. We expect a lot of work from these tiny, ephemeral particles, forcing them to hold vast amounts of data for long enough to perform complex calculations. It turns out that qubits are extremely sensitive to even the slightest noise interfering in a quantum system.

To mitigate noise, quantum computer manufacturers have gone to great lengths, such as cooling their systems to nearly absolute zero (colder than outer space) and placing them in shielded rooms impervious to cosmic radiation. Some are experimenting with building multiple quantum circuits that can run in parallel, thus limiting the proximity of qubits to one another. Despite these heroic efforts to protect our delicate qubits, qubit longevity or “coherence time” is often measured in microseconds or less. That’s not a long time in which to do your calculations.

Noise cancellation
Beyond causing a qubit to decohere and lose its data outright, noise has a second consequence in quantum systems: inaccuracy. If some qubits decohere in the middle of a calculation, you may not want to put too much faith in the result of that calculation.

I wrote previously about Shor’s Algorithm, which shows that a quantum computer could factor large numbers exponentially faster than any known classical method. In 2012, researchers at UC Santa Barbara implemented Shor’s Algorithm on a quantum computer and used it to find the prime factors of 15. The quantum computer arrived at the correct answer, 3 x 5, about half the time over 150,000 iterations of the problem. We might smile wryly at such a trivial example, but it proved two things: first, that Shor’s Algorithm works; second, that a lot of work still needs to be done.
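
For those curious about the arithmetic behind that experiment, here is a minimal, purely classical sketch of the number theory Shor’s Algorithm relies on, applied to the same toy number, 15. The period-finding step is done by brute force below; that is exactly the step a quantum computer accelerates, so treat this as an illustration of the idea rather than a quantum implementation.

```python
from math import gcd

def classical_period(a, n):
    # Find the smallest r with a^r = 1 (mod n) by brute force.
    # This period-finding step is the part Shor's Algorithm speeds up
    # on quantum hardware; everything else is ordinary number theory.
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def factor_via_period(n, a=7):
    # Classical post-processing used by Shor's Algorithm.
    common = gcd(a, n)
    if common != 1:
        return common, n // common  # the guess already shares a factor
    r = classical_period(a, n)
    if r % 2 != 0:
        raise ValueError("odd period; retry with a different base a")
    x = pow(a, r // 2, n)
    # If x = -1 (mod n) this yields only a trivial factor and a new base is needed.
    return gcd(x - 1, n), gcd(x + 1, n)

print(factor_via_period(15))  # (3, 5)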

Think of it as the first flight of the Wright Brothers—they didn’t get far, but they did prove that controlled flight was possible—and look at where the aviation and aerospace industries are now. Much progress has been made using the noise reduction techniques I described above, and at the same time quantum vendors are improving accuracy by building more redundancy into their systems. By creating logical qubits out of multiple physical qubits, scientists are better able to tell where an error might have occurred and make the necessary correction. However, many experts expect it will take systems of hundreds of thousands, even millions, of qubits before there’s enough error correction to perform accurate calculations on a large scale. Given that current quantum computers are measured in the hundreds of qubits (with a system of over 1,000 qubits expected later this year), their true value likely won’t be realized until the late 2020s or early 2030s.
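
Real quantum error-correcting codes are far more involved than anything that fits in a short snippet (quantum states can’t simply be copied, for one thing), but the intuition behind redundancy can be shown with a classical repetition code: encode one logical bit as several physical bits, let the noise do its worst, and take a majority vote. All the numbers below are invented for illustration.

```python
import random

def encode(bit, copies=3):
    # One logical bit becomes several physical copies.
    return [bit] * copies

def apply_noise(codeword, flip_probability=0.1):
    # Each physical bit flips independently with some probability.
    return [b ^ 1 if random.random() < flip_probability else b for b in codeword]

def decode(codeword):
    # Recover the logical bit by majority vote.
    return 1 if sum(codeword) > len(codeword) / 2 else 0

# Rough experiment: how often does the logical bit survive the noise?
trials = 100_000
survived = sum(decode(apply_noise(encode(1))) == 1 for _ in range(trials))
print(f"logical bit recovered in {survived / trials:.1%} of trials")
```

With a ten percent flip rate, a single physical bit is wrong one time in ten, while the three-bit majority vote is wrong only about three times in a hundred; adding more copies pushes the error rate down further, which is the same trade-off quantum vendors make when they spend many physical qubits on a single logical qubit.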

I mentioned above that there are at least ten different approaches being taken today toward quantum computing. These can be classified into three types: gate-based quantum, quantum annealing, and quantum-inspired optimization. The problems associated with NISQ apply to the first type. Although gate-based quantum (the only type based on actual qubits) holds the greatest promise of exponential improvements, overly enthusiastic adopters of the technology take a similar risk to a runner starting too fast in a race—burning themselves out too soon.

The other two types of quantum computing available today take a different approach. Quantum annealing uses specialized hardware to simulate qubits and is tuned to solve specific optimization problems. Quantum-inspired optimization uses classical hardware and software algorithms to simulate qubits, and again is useful for some types of optimization problems. Both approaches are commercially available and have been proven to perform complex optimizations with up to ten or fifteen percent improvement in results over classical methods. For projects such as financial portfolio optimization or geographical service coverage, this modest improvement is already a compelling value proposition. Algorithms coded for quantum annealing or quantum-inspired optimization could be re-executed on gate-based quantum hardware in the future to yield even better results.
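
To make that concrete, here is a small, purely classical simulated-annealing sketch of the kind of subset-selection problem (a toy portfolio) these approaches are aimed at. The asset data, the scoring function, and the parameters are all invented for illustration; real quantum annealers and quantum-inspired solvers have their own problem formats and APIs, but the shape of the problem is similar.

```python
import math
import random

# Toy portfolio problem: choose a subset of assets that maximizes expected
# return while penalizing pairs that are strongly correlated. All numbers
# are made up for illustration.
expected_return = [0.08, 0.12, 0.10, 0.07, 0.15]
correlation_penalty = {(0, 1): 0.05, (1, 4): 0.09, (2, 3): 0.04}

def score(selection):
    total = sum(r for r, chosen in zip(expected_return, selection) if chosen)
    for (i, j), penalty in correlation_penalty.items():
        if selection[i] and selection[j]:
            total -= penalty
    return total

def simulated_annealing(n_assets=5, steps=5000, temperature=1.0, cooling=0.999):
    # Classical simulated annealing: occasionally accept a worse move, with a
    # probability that shrinks as the "temperature" drops, to escape local optima.
    current = [random.randint(0, 1) for _ in range(n_assets)]
    best = current[:]
    for _ in range(steps):
        candidate = current[:]
        candidate[random.randrange(n_assets)] ^= 1  # toggle one asset in or out
        delta = score(candidate) - score(current)
        if delta >= 0 or random.random() < math.exp(delta / temperature):
            current = candidate
            if score(current) > score(best):
                best = current[:]
        temperature *= cooling
    return best, score(best)

print(simulated_annealing())
```

Quantum annealers and quantum-inspired solvers attack this same kind of formulation with specialized hardware or smarter algorithms, which is why a problem written this way can later be moved to better back ends as they mature.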

Keeping pace
Ultimately, running a successful race toward the goal of quantum advantage or supremacy looks something like this: All competitors keep their eyes firmly on the finish line, achieving quantum supremacy through large-scale gate-based quantum hardware. We maintain our investments in research and development to overcome the problems of quantum noise and realize the large-scale benefits of quantum computing. And, in the meantime, we slowly and steadily continue to accumulate incremental wins by running our algorithms in quantum annealing or quantum-inspired environments.

Make no mistake, material benefits can still be had while we sidestep the obstacles in our path and gain the necessary experience and strength to reach the finish line.
