Quantum computing systems are moving from a first generation of highly error-prone machines of limited usefulness (the Noisy Intermediate-Scale Quantum, or NISQ, era) towards measurement-heavy systems with the capabilities necessary to implement quantum error correction.

At IEEE Quantum Week in September 2022, a variety of workshops and keynotes highlighted the emerging engineering challenges of getting quantum computing ‘to scale’. We also learned more about the burgeoning ecosystem looking ahead to error-corrected machines. The Riverlane-led workshop, ‘Beyond NISQ’, brought together stakeholders representing all the major commercial qubit types. Discussion focused on identifying the key milestones, metrics and needs for the error correction community over the next five years.

One key ingredient is making qubits better, both in isolation and in groups. The fidelity of operations like gates, the ability to reset qubits to ‘zero’, and the means to measure a qubit’s final state quickly and accurately were all raised as crucial functionalities to improve. There’s a tremendous amount of engineering and materials science going into these tasks. Although the week did not have a single ‘aha’ moment in qubit performance, the collective gains in the field are impressive.
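To see why these incremental fidelity gains matter so much, consider the standard surface-code heuristic for how the logical error rate falls as physical operations improve. The sketch below uses assumed values for the prefactor and threshold (A = 0.1, p_th = 1%), so the numbers are purely illustrative:

```python
# Illustrative model only: the widely used surface-code heuristic
# p_logical ~ A * (p / p_th)^((d + 1) / 2), with assumed constants.
A, P_TH = 0.1, 1e-2  # assumed prefactor and error-rate threshold

def logical_error_rate(p: float, d: int) -> float:
    """Estimated logical error rate per round at code distance d."""
    return A * (p / P_TH) ** ((d + 1) / 2)

for p in (5e-3, 2e-3, 1e-3):   # physical error rate per operation
    for d in (11, 21):         # surface-code distance
        print(f"p={p:.0e}, d={d}: p_L ~ {logical_error_rate(p, d):.2e}")
```

Under this model, halving the physical error rate improves the logical error rate by orders of magnitude at large distances, which is why steady gains in gate, reset and measurement fidelity compound so powerfully.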

The other key ingredient is reducing the cost of building and running qubits. This is not just dollars and pounds but also watts and people. Moving from the NISQ approach to quantum error correction changes fundamental architectural decisions and may open new opportunities for better, lower-cost systems. On one hand, the set of required operations is smaller, as universal gate sets are typically finite and do not include parametric gates such as continuous controlled-phase gates. On the other hand, QEC requires substantial numbers of measurements in nearly every gate layer, and optimisation towards such high measurement-throughput devices is nascent.
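A back-of-the-envelope calculation makes that measurement load concrete. All the numbers below (code distance, round time, logical qubit count) are assumptions chosen for illustration, not figures for any particular device:

```python
# Rough sketch of QEC measurement throughput under assumed parameters.
D = 21             # assumed surface-code distance
ROUND_TIME = 1e-6  # assumed syndrome-extraction round time, in seconds
N_LOGICAL = 100    # assumed number of logical qubits

stabilizers_per_patch = D**2 - 1  # X- and Z-type checks per round
per_second = stabilizers_per_patch / ROUND_TIME
total = N_LOGICAL * per_second

print(f"{stabilizers_per_patch} measurements per patch per round")
print(f"~{per_second:.1e} measurements/s per logical qubit")
print(f"~{total:.1e} measurements/s across {N_LOGICAL} logical qubits")
```

Even this small assumed machine generates tens of billions of measurement outcomes per second, all of which must be read out, transported and processed in real time.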

Many participants throughout the week highlighted how control systems, as currently developed, are unlikely to achieve the cost and performance necessary for thousands, and eventually millions, of qubits. However, these conversations reinforced a key insight that Riverlane has gained: leveraging the semiconductor industry to solve the control and decoding computational challenges is expensive up front but will ultimately enable production at scale.

It has been established that the “decoding problem” is a bottleneck for error-corrected quantum computation. Solving the classical decoding problem fast enough not to limit the speed of logical operations requires a multi-disciplinary approach to improve algorithms and build dedicated hardware. But speed is not the only important milestone to consider for decoders. There are several other functionalities that a decoder should exhibit over the next five years to support progress towards error-corrected quantum computation.
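As a toy illustration of what a decoder actually does, the sketch below decodes the classical repetition code: each syndrome bit compares two neighbouring data bits, and the decoder must infer a most-likely error pattern from the syndrome alone. Real QEC decoders solve a far harder two- and three-dimensional version of this problem, fast enough to keep pace with microsecond-scale syndrome rounds:

```python
def syndrome(error: list[int]) -> list[int]:
    """Parity checks between neighbouring data bits."""
    return [error[i] ^ error[i + 1] for i in range(len(error) - 1)]

def decode(syn: list[int]) -> list[int]:
    """Return the lower-weight of the two error patterns matching syn."""
    # Fixing the first bit (to 0 or 1) determines the whole pattern.
    guess = [0]
    for s in syn:
        guess.append(guess[-1] ^ s)
    flipped = [b ^ 1 for b in guess]
    return guess if sum(guess) <= sum(flipped) else flipped

error = [0, 1, 1, 0, 0]    # two physical bit-flips on five data bits
syn = syndrome(error)      # all the hardware actually reveals
print(syn, decode(syn))    # the inferred correction matches the error
```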

In fact, an important highlight referenced across multiple conference streams was the emergence of the Riverlane-developed parallel window decoder as a scalable means of leveraging cheap per-unit semiconductor hardware in a massively parallel system that scales with the size of the quantum computer. Additional techniques to reduce the cost of decoding, from neural-network pre-decoders to cryogenic logic systems adjacent to the qubits (based on CMOS or more exotic technologies), also pointed to low-hanging fruit for better integration of quantum control and quantum error correction systems.
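To convey the windowing idea schematically (this is not Riverlane's actual implementation): the syndrome stream is cut into overlapping windows, each window is decoded independently on its own worker, and the overlaps absorb boundary effects between windows. Here `toy_decode` is a hypothetical stand-in for a real inner decoder such as minimum-weight perfect matching:

```python
from concurrent.futures import ThreadPoolExecutor

CORE, BUFFER = 8, 4  # rounds committed per window; overlap on each side

def toy_decode(window):
    # Hypothetical placeholder: a real system would run matching here and
    # commit corrections only for the window's core region.
    return [round_id for round_id, _syndrome in window]

def windows(stream):
    """Carve the syndrome stream into overlapping windows."""
    for start in range(0, len(stream), CORE):
        lo = max(0, start - BUFFER)
        hi = min(len(stream), start + CORE + BUFFER)
        yield stream[lo:hi]

stream = [(t, f"syndrome@{t}") for t in range(32)]  # 32 fake rounds
with ThreadPoolExecutor() as pool:
    results = list(pool.map(toy_decode, windows(stream)))
print(f"decoded {len(results)} windows concurrently")
```

Because each window fits on a cheap, fixed-size processing unit and the number of windows simply grows with the machine, throughput scales by adding more identical units rather than by building an ever-faster monolithic decoder.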

What about the benchmarks and metrics that will drive future progress?

At the Riverlane-sponsored panel on this topic, the consensus was to focus not on whole algorithms but at the subroutine level. There, specific techniques like syndrome extraction circuits, entanglement distillation, magic state distillation and logical gate sequences can all feed into performance characterisation. Specifically, the benchmarks and metrics should answer the question: “What types of operation sequences stress the hardware system?”
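As a sketch of what subroutine-level benchmarking could look like, the snippet below builds a single weight-4 Z-stabilizer syndrome-extraction circuit as a plain gate list and tallies the operation types that stress the hardware (resets, two-qubit gates, measurements). The gate names and layout are illustrative assumptions, not any particular vendor's instruction set:

```python
from collections import Counter

def z_syndrome_extraction(data_qubits, ancilla):
    """Gate sequence measuring a Z ⊗ Z ⊗ Z ⊗ Z check onto one ancilla."""
    circuit = [("RESET", ancilla)]
    for q in data_qubits:
        circuit.append(("CNOT", q, ancilla))  # fold parity into the ancilla
    circuit.append(("MEASURE_Z", ancilla))
    return circuit

circuit = z_syndrome_extraction(data_qubits=(0, 1, 2, 3), ancilla=4)
profile = Counter(op[0] for op in circuit)
print(profile)  # e.g. Counter({'CNOT': 4, 'RESET': 1, 'MEASURE_Z': 1})
```

Profiling subroutines this way, rather than timing whole algorithms, exposes exactly which primitive operations a given hardware platform must improve.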

Given these efforts and discussions, where does the field go from here?

Today we are collaborating with error correction experts across the world to develop a whitepaper on milestones for the next five years of error correction. This effort involves the whole quantum computing community coming together to address the big challenges ahead:

  • How to make qubits at scale
  • How to integrate control and measurement systems at speed
  • How to decode error information fast enough and cheaply enough to keep up
  • How to track our collective progress to ensure that our outcomes lead to the first error-corrected quantum computers

The whitepaper that we’ll publish over the coming months will dive into each of these questions.

By Jake Taylor, Chief Science Officer | Kenton Barnes, Senior Quantum Engineer | Rossy Nguyen, Senior Product Manager (Quantum Error Correction)