At Riverlane, we partner with leading quantum hardware companies, university labs and governments to design and build the quantum error correction layer for all quantum computers.

The pace of our research and development efforts is fast, and July was a particularly busy month, with five research papers posted on arXiv, all awaiting peer review.

We wanted to celebrate the team’s efforts and share a quick round-up of last month’s research.

Belief propagation as a partial decoder

Quantum decoders identify the errors that constantly occur in quantum computers and work out how to correct them. They’re a fundamental component of the quantum computing stack and vital to unlocking useful quantum computing.

In this paper, we present a new decoding method that accelerates the decoding cycle by splitting error correction into two stages. The first stage uses belief propagation as a partial decoder to correct errors that occurred with high probability. In the second stage, a conventional decoder corrects any remaining errors.

Our results show that this method decreases bandwidth requirements, increases decoding speed and improves logical accuracy.
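The two-stage split can be sketched in a few lines of Python. This is a toy illustration on a 3-qubit repetition code, not the paper’s belief propagation implementation: the “posterior” probabilities standing in for belief propagation output are invented for the example, and the conventional decoder is a simple lookup table.

```python
# Toy sketch of the two-stage idea on a 3-qubit repetition code -- NOT the
# paper's belief propagation implementation. Stage 1 commits only to errors
# flagged with high confidence; stage 2 hands any residual syndrome to a
# conventional (here, lookup-table) decoder.

def syndrome(error):
    """Syndrome bits (q0^q1, q1^q2) for a bit-flip pattern on 3 qubits."""
    q0, q1, q2 = error
    return (q0 ^ q1, q1 ^ q2)

# Stand-in for belief propagation output: per-qubit flip probabilities for
# each syndrome. Values are invented for illustration; the (1, 1) case is
# deliberately ambiguous so the partial decoder leaves it alone.
POSTERIOR = {
    (0, 0): (0.01, 0.01, 0.01),
    (1, 0): (0.95, 0.03, 0.01),
    (1, 1): (0.50, 0.50, 0.02),
    (0, 1): (0.01, 0.03, 0.95),
}

# Conventional decoder: most likely error for each syndrome.
FULL_DECODER = {
    (0, 0): (0, 0, 0),
    (1, 0): (1, 0, 0),
    (1, 1): (0, 1, 0),
    (0, 1): (0, 0, 1),
}

def two_stage_decode(s, threshold=0.9):
    # Stage 1: flip only the qubits the partial decoder is confident about.
    partial = tuple(1 if p > threshold else 0 for p in POSTERIOR[s])
    # Syndrome left over after applying the partial correction.
    residual = tuple(a ^ b for a, b in zip(s, syndrome(partial)))
    # Stage 2: the conventional decoder corrects whatever remains.
    rest = FULL_DECODER[residual]
    return tuple(a ^ b for a, b in zip(partial, rest))
```

When the first stage is confident, the residual syndrome is trivial and the second stage has nothing left to do; only the ambiguous syndrome falls through to the conventional decoder. That division of labour is the intuition behind the bandwidth and speed savings.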

Neural network decoder for near-term surface-code experiments

There are many different quantum error correction codes and types of quantum decoders. Quantum error correction codes essentially allow us to detect and correct the errors caused by unwanted interactions between a quantum computer’s qubits and their environment.

Each quantum error correction code and decoder type possesses its own unique benefits and drawbacks. The surface code is a popular error-correcting code, and neural network decoders can achieve lower logical error rates than conventional decoders.

This paper investigates how a neural network decoder performs on the surface code, revealing that neural network decoders are potentially well-suited for near-term demonstrations in quantum computing. This work was done in collaboration with Delft University of Technology.
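To give a flavour of what a neural network decoder does, here is a minimal sketch in pure Python. It is not the paper’s architecture or code: instead of the surface code, it trains a tiny multilayer perceptron to map the syndromes of a 3-qubit repetition code to the most likely single bit-flip error.

```python
import math
import random

# Toy illustration only -- not the paper's network or the surface code.
# A 3-qubit repetition code has syndrome bits s1 = q0 XOR q1, s2 = q1 XOR q2.
# We train a tiny MLP to map each syndrome to the most likely error:
# class 0 = no error, class i = a bit flip on qubit i-1.
DATA = [((0, 0), 0), ((1, 0), 1), ((1, 1), 2), ((0, 1), 3)]

random.seed(0)
H, C = 8, 4  # hidden units, output classes
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [[random.uniform(-1, 1) for _ in range(H)] for _ in range(C)]
b2 = [0.0] * C

def forward(x):
    """Return hidden activations and softmax class probabilities."""
    h = [math.tanh(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(H)]
    logits = [sum(W2[i][j] * h[j] for j in range(H)) + b2[i] for i in range(C)]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return h, [e / z for e in exps]

def decode(syndrome):
    """Most probable error class for a given syndrome."""
    _, p = forward(syndrome)
    return max(range(C), key=p.__getitem__)

lr = 0.3
for epoch in range(3000):
    for x, y in DATA:  # plain SGD on a cross-entropy loss
        h, p = forward(x)
        dlogits = [p[i] - (1.0 if i == y else 0.0) for i in range(C)]
        dh = [sum(dlogits[i] * W2[i][j] for i in range(C)) * (1 - h[j] ** 2)
              for j in range(H)]
        for i in range(C):
            for j in range(H):
                W2[i][j] -= lr * dlogits[i] * h[j]
            b2[i] -= lr * dlogits[i]
        for j in range(H):
            W1[j][0] -= lr * dh[j] * x[0]
            W1[j][1] -= lr * dh[j] * x[1]
            b1[j] -= lr * dh[j]
    if all(decode(x) == y for x, y in DATA):
        break  # stop once every syndrome is classified correctly

print([decode(x) for x, _ in DATA])
```

A real surface-code decoder faces a far harder version of the same task: many more syndrome bits, measured over repeated noisy rounds, which is exactly where a learned model can outperform hand-crafted rules.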

Compilation of a simple chemistry application to quantum error correction primitives

Part of Riverlane’s work includes developing the quantum algorithms to instruct a quantum computer on what operations and calculations to perform. This application layer sits above the quantum error correction layer, which we call Deltaflow.OS.

This step-by-step guide explains how to set up and compile a simple quantum algorithm to simulate a hydrogen molecule on a fault-tolerant quantum computer.

This is an important proof-of-principle and includes a resource estimate of how many qubits and error correction rounds would be required, emphasising the need for improved near-term quantum error correction.
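To show the shape of such a resource estimate, here is a back-of-the-envelope sketch using the standard surface-code scaling heuristic p_L ≈ A · (p / p_th)^((d+1)/2). The constants A and p_th below are common illustrative values, not numbers from the paper.

```python
# Back-of-the-envelope sketch of a qubit-count estimate, using the standard
# surface-code scaling heuristic p_L ~ A * (p / p_th)^((d+1)/2).
# A and p_th are illustrative values, not numbers from the paper.

def required_distance(p_phys, p_target, p_th=0.01, A=0.1):
    """Smallest odd code distance d whose estimated logical error rate
    falls below p_target."""
    d = 3
    while A * (p_phys / p_th) ** ((d + 1) / 2) >= p_target:
        d += 2
    return d

def physical_qubits(d):
    """Rotated surface code: d*d data qubits plus d*d - 1 ancillas
    per logical qubit."""
    return 2 * d * d - 1

d = required_distance(p_phys=1e-3, p_target=5e-11)
print(d, physical_qubits(d))
```

Even this crude model shows why the overheads add up quickly: hundreds of physical qubits per logical qubit, before accounting for the many error correction rounds an algorithm needs.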

The Electronic Structure of the Hydrogen Molecule: A Tutorial Exercise in Classical and Quantum Computation

Simulating molecules and materials with classical methods is hugely complex. We often hit a computational limit where calculations take too long to yield meaningful results, forcing scientists to fall back on long, real-world test and development cycles.

Hydrogen is a simple molecule used to test current quantum algorithms. Such simulations will pave the way toward future breakthrough applications involving much more complex molecules and materials. As quantum algorithms are not yet mature, many classical models remain in use, each suited to different tasks.

This paper is a helpful tutorial comparing different calculations on a simple hydrogen molecule on both classical and quantum computers. It is intended to help beginners understand the classical and quantum strategies developed over the years for chemical calculations.
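As a taste of the classical side: in a minimal basis, the hydrogen molecule can be reduced by symmetry to an effective one-qubit Hamiltonian of the form H = g0·I + g1·Z + g2·X, which a classical computer diagonalises exactly. The coefficients below are placeholders, not values derived in the paper; a quantum algorithm such as VQE would instead estimate the same ground energy from measurements on a parameterised quantum state.

```python
import math

# Effective one-qubit Hamiltonian H = g0*I + g1*Z + g2*X, i.e. the 2x2 matrix
# [[g0 + g1, g2], [g2, g0 - g1]]. The coefficients are illustrative
# placeholders, NOT the hydrogen-molecule values derived in the paper.
g0, g1, g2 = -0.5, 0.35, -0.45

def ground_energy(g0, g1, g2):
    """Exact ground-state energy: the smaller eigenvalue of the 2x2 matrix,
    g0 - sqrt(g1^2 + g2^2)."""
    return g0 - math.sqrt(g1 ** 2 + g2 ** 2)

print(ground_energy(g0, g1, g2))
```

For two hydrogen atoms this is trivial on a laptop; the point of the tutorial is that the same comparison becomes impossible classically as molecules grow, which is where quantum computers are expected to take over.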

Tangling schedules eases hardware connectivity requirements for quantum error correction

Superconducting quantum computers have a fixed qubit layout and connectivity. This makes it difficult to implement the stabiliser codes needed to stop errors propagating in large-scale quantum computers: because the connections between physical qubits are fixed, protecting a logical qubit would require either changes to the hardware or unrealistic resource overheads.

In this paper, we introduce our tangled syndrome extraction technique. This technique is the first to enable the measurement of the required stabilisers without hardware modification, giving superconducting hardware companies a realistic route to scale up without significant changes to their existing hardware.