The race to build a fully functional quantum stack

Quantum computers exploit a seemingly bizarre yet well-established property of the universe: until a particle interacts with another, its position, speed, color, spin and other quantum properties exist as a probability distribution over all possibilities, a state known as superposition. Quantum computers use isolated particles as their most basic building blocks, relying on any one of these quantum properties to represent the state of a quantum bit (or “qubit”). So while a classical bit always exists in a mutually exclusive state of either 0 (low energy) or 1 (high energy), a qubit in superposition exists in both states, 0 and 1, at once.
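To make that concrete, here is a minimal sketch in plain NumPy (not any vendor's API) that represents a single qubit as a two-element vector of complex amplitudes and places it into an equal superposition with a Hadamard gate; the squared magnitudes of the amplitudes give the probability of reading 0 or 1.

```python
import numpy as np

# A qubit's state is a 2-element vector of complex amplitudes:
# |0> is [1, 0] and |1> is [0, 1].
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes:
# ~50% chance of reading 0 and ~50% chance of reading 1.
print(np.abs(qubit) ** 2)  # [0.5 0.5]
```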

Things get interesting at larger scales, as QC systems are capable of isolating a group of entangled particles that all share a single state of superposition. While a single qubit coexists in two states, a set of eight entangled qubits (or “8Q”), for example, simultaneously occupies all 2^8 (or 256) possible states, effectively processing all these states in parallel. It would take 57Q (representing 2^57 parallel states) for a QC to outperform even the world’s strongest classical supercomputer. A 64Q computer would surpass it by roughly 100x (clearly achieving quantum advantage), and a 128Q computer would surpass it more than a quintillion times over.
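These scaling claims are just exponent arithmetic; the short sketch below reproduces the figures quoted above (the 2^57 figure attributed to a top classical supercomputer is taken from the text, not measured).

```python
# An n-qubit register spans 2**n simultaneous basis states.
for n in (8, 57, 64, 128):
    print(f"{n}Q spans 2^{n} = {2**n:.3e} states")

# Ratios relative to the ~2^57 states attributed above to the strongest
# classical supercomputer:
print(2**64 / 2**57)   # 128.0   -> "roughly 100x"
print(2**128 / 2**57)  # ~2.4e21 -> more than a quintillion times
```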

In the race to develop these computers, nature has inserted two major speed bumps. First, isolated quantum particles are highly unstable, so quantum circuits must execute within extremely short periods of coherence. Second, measuring the output energy level of subatomic qubits requires extreme accuracy, and tiny deviations routinely thwart it. Informed by university research, leading QC companies like IBM, Google, Honeywell and Rigetti are developing quantum engineering and error-correction methods to overcome these challenges as they scale up the number of qubits they can process.
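Quantum error correction itself (surface codes and the like) is well beyond the scope of this piece, but the underlying intuition is classical redundancy: encode one logical bit in several physical bits and recover it by majority vote. The sketch below shows only that classical analogy; it is illustrative and not how any vendor's quantum error correction actually works.

```python
import random

# Classical intuition behind error correction: encode one logical bit as
# several physical bits and recover it by majority vote.

def encode(bit, copies=3):
    return [bit] * copies

def noisy_channel(bits, flip_prob=0.1):
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    return int(sum(bits) > len(bits) / 2)  # majority vote

random.seed(0)
trials = 10_000
uncorrected = sum(noisy_channel([1])[0] != 1 for _ in range(trials))
corrected = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"error rate without redundancy:        {uncorrected / trials:.3f}")  # ~0.10
print(f"error rate with 3-bit majority vote:  {corrected / trials:.3f}")    # ~0.03
```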

Beyond the challenge of creating working hardware, software must be developed to harness the benefits of parallelism even though we cannot observe what is happening inside a quantum circuit without losing superposition. When we measure the output value of a quantum circuit’s entangled qubits, the superposition collapses into just one of the many possible outcomes. Sometimes, though, the output yields clues that qubits weirdly interfered with themselves (that is, with their probabilistic counterparts) inside the circuit.
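A toy illustration of interference, again in plain NumPy rather than real hardware: applying a Hadamard gate twice returns a qubit to 0 with certainty, because the two amplitude paths leading to 1 cancel out; measuring ("peeking") between the two gates collapses the superposition, destroys that interference, and leaves an ordinary 50/50 coin flip.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Undisturbed circuit: H followed by H. The two paths into |1> carry
# opposite-sign amplitudes and cancel out (destructive interference).
out = H @ (H @ ket0)
print(np.abs(out) ** 2)  # [1. 0.] -> always measures 0

# Measuring mid-circuit collapses the state after the first H,
# destroying the interference pattern; the final result is 50/50 again.
rng = np.random.default_rng(0)
counts = {0: 0, 1: 0}
for _ in range(1000):
    mid_probs = np.abs(H @ ket0) ** 2
    collapsed = rng.choice(2, p=mid_probs)           # "peek" after the first H
    state = np.zeros(2, dtype=complex)
    state[collapsed] = 1
    final = rng.choice(2, p=np.abs(H @ state) ** 2)  # measure after the second H
    counts[int(final)] += 1
print(counts)  # roughly {0: 500, 1: 500}
```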

QC scientists at UC Berkeley, the University of Toronto, the University of Waterloo, the University of Technology Sydney and elsewhere are now developing a fundamentally new class of algorithms that detect the absence or presence of interference patterns in QC output to cleverly glean information about what happened inside.

The QC stack

A fully functional QC must, therefore, incorporate several layers of a novel technology stack, spanning both hardware and software components. At the top of the stack sits the application software for solving problems in chemistry, logistics, etc. The application typically makes API calls to a software layer beneath it (loosely referred to as a “compiler”) that translates those function calls into the circuits that implement them. Beneath the compiler sits a classical computer that feeds circuit changes and inputs to the Quantum Processing Unit (QPU) beneath it. The QPU typically has an error-correction layer, an analog processing unit that transmits analog inputs to the quantum circuit and measures its analog outputs, and the quantum processor itself, which houses the isolated, entangled particles.
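The layering can be sketched as a chain of thin interfaces. Everything below is a hypothetical, heavily simplified illustration; the class and method names are placeholders rather than any vendor's actual API, and real stacks differ considerably in detail.

```python
# Hypothetical sketch of the stack described above; all names are illustrative.

class QuantumProcessor:
    """Bottom layer: houses the isolated, entangled particles."""
    def run(self, pulses):
        return [0] * len(pulses)  # stand-in for raw analog readout

class QPU:
    """Error-correction layer + analog unit + quantum processor."""
    def __init__(self):
        self.processor = QuantumProcessor()
    def execute(self, circuit):
        pulses = [("pulse", gate) for gate in circuit]  # digital -> analog (stub)
        raw = self.processor.run(pulses)
        return raw  # an error-correction pass would post-process this readout

class ClassicalController:
    """Classical computer that feeds circuit changes and inputs to the QPU."""
    def __init__(self):
        self.qpu = QPU()
    def submit(self, circuit):
        return self.qpu.execute(circuit)

class Compiler:
    """Translates high-level API calls from the application into circuits."""
    def __init__(self):
        self.controller = ClassicalController()
    def call(self, function_name, *args):
        circuit = [("H", 0), ("CNOT", 0, 1)]  # stub circuit for the requested function
        return self.controller.submit(circuit)

# Top of the stack: the application only ever talks to the compiler's API.
print(Compiler().call("ground_state_energy", "H2"))
```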

As QC evolves over the coming decades, vendors will carve out their own blocks of this evolving stack, but today’s pioneers do not have the luxury of such focus. Today’s QC systems are still too complex, unique and finicky to expose to the ecosystem. The teams who develop a proprietary, integrated stack will likely be first to achieve practical quantum advantage.

QC modalities

The various competing architectures for quantum processors exploit a wide range of quantum particle properties (“modalities”) to represent qubits. Each prevailing architecture has different advantages around coherence times, measurement fidelity, design scalability and operating scalability. The industry has consolidated around a few that currently offer the most immediate and clear path to large-scale systems; chief among them are superconducting qubits. This approach, used by the three current leaders in the race (IBM, Google and Rigetti), relies on circuits of superconducting material kept at cryogenic temperatures. It offers superior latency (~50 nanoseconds per operation) compared to other modalities but limited coherence times (qubits remain entangled for only ~50 microseconds). Recent demonstrations of intermediate-scale systems suggest it will be the first modality to enable practical quantum advantage.

Meanwhile, Honeywell, IonQ and others are pursuing another leading modality, trapped-ion qubits, which relies on a combination of electric and magnetic fields to capture charged particles in an isolated system. This architecture faces challenges around its scale-out potential and latency, but offers significantly longer coherence times, as its qubits can stay entangled for nearly a minute. Other frontier modalities at earlier stages of research and commercialization, such as cold-atom arrays, photonic qubits and topological qubits, may eventually succeed with their own sets of advantages, arriving later to market.
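One way to read this trade-off is as an "operations per coherence window" budget, sketched below. The superconducting figures come from the paragraph above; the trapped-ion gate time used here (~100 microseconds) is an assumed order-of-magnitude figure for illustration only, not a quoted spec.

```python
# Rough gate-budget arithmetic for the trade-off described above.
sc_gate_time, sc_coherence = 50e-9, 50e-6    # superconducting (figures from the text)
ion_gate_time, ion_coherence = 100e-6, 60.0  # trapped ion (gate time is an assumption)

print(f"superconducting: ~{sc_coherence / sc_gate_time:,.0f} ops per coherence window")   # ~1,000
print(f"trapped ion:     ~{ion_coherence / ion_gate_time:,.0f} ops per coherence window")  # ~600,000
```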

A proliferation of distinct QC systems could eventually enable users and organizations to choose different modalities for different use cases. But as with other technology markets, the first modality on which quantum advantage is demonstrated could well dominate the field for generations, primarily because future investments naturally flow toward the most mature technologies. As of 2020, superconducting qubits seem poised to enable the first practical cases of quantum advantage.

The race is a marathon

In the early 1980s, in a series of lectures at MIT and Caltech, Argonne physicist Paul Benioff described the first quantum mechanical model of a computer, and Richard Feynman floated the basic model for building such a machine. Commercial development ramped up only recently, though D-Wave, founded in 1999, went on to develop its 2000Q annealer for solving specialized optimization problems.

Significant commercial progress toward generalized quantum advantage was first demonstrated in 2019, suggesting an inflection point in quantum tech. Within the span of a few months, three superconducting-qubit vendors (IBM, Google and Rigetti) presented working systems in the 32Q-54Q range (at 95%-99% fidelity) with credible roadmaps for building 100Q+ systems capable of demonstrating quantum advantage. Trapped-ion vendors Honeywell and IonQ presented longer-term strategies to reach a similar scale with coherence times that accommodate more complex computations. PsiQuantum, an early pioneer of the photonics modality, raised a $230 million venture round. Perhaps most notably, Amazon, IBM and Microsoft announced that they would distribute quantum computing services as part of their commercial cloud offerings.

With dozens of corporates and startups pursuing the development of quantum technology, it is difficult to say who will first demonstrate quantum advantage or when such a demonstration will first occur. Many predict that it will happen within the next few years, possibly as early as 2021.

Of course, the end of this race marks the start of another. In the coming decade, QC vendors will compete by increasing scale and minimizing errors. The near-term winners will likely employ hybrid architectures, leveraging quantum and classical compute units for their relative advantages, and will tailor their products to industries that can benefit from incremental improvements rather than requiring fully accurate results, such as supply-chain management, logistics optimization, financial analysis and seismic research.

By the 2030s, a small number of leading vendors should hopefully be far enough along to develop scalable, error-free QCs with high coherence, enabling QC software vendors to tackle all the problems currently addressed by classical supercomputers for pharmaceuticals, finance, chemical engineering, AI, agriculture and logistics.

We shouldn’t expect quantum computers in our homes or the palms of our hands. But the immense computational power that QC can unleash will lead to life-changing medicines, more accurate weather forecasts, smarter AIs, longer-lasting batteries, safer space travel, sustainable energy sources and benefits society has yet to even discover. As we approach the age of quantum computing, it is no longer a question of “if,” but rather one of “when” this technology finally matures and “who” will lead this emerging industry. The race is on — may the best team win.