Earl Campbell
21 Jul 2023
Quantum decoders detect and correct the vast flood of data errors unique to quantum computers.
Quantum error correction is quantum computing’s defining challenge. It is a set of techniques used to protect the information stored in qubits from errors and decoherence caused by noise.
Quantum decoders rely on quantum error correction codes. These codes use many physical qubits, each with a relatively high error rate, to encode the information of a single ‘logical’ qubit, so that errors can be identified through a decoding process.
This increases the logical fidelity (or accuracy), making the computation more reliable. However, most fast decoders neglect important characteristics of the noise (the interference that leads to data errors), and ignoring these characteristics reduces the accuracy of the resulting calculation.
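To make the decoding step concrete, here is a minimal sketch of a surface-code memory experiment decoded with a standard minimum-weight perfect matching decoder. It is not the belief-matching decoder from our paper, just a common baseline, and it assumes the open-source Stim and PyMatching packages are installed; the distance, rounds and noise strength are illustrative values.

```python
# Minimal sketch (not the paper's belief-matching decoder): simulate a
# distance-5 surface code memory experiment with Stim and decode it with
# PyMatching's minimum-weight perfect matching decoder.
import numpy as np
import pymatching
import stim

# Generate a noisy surface-code circuit (circuit-level depolarizing noise).
circuit = stim.Circuit.generated(
    "surface_code:rotated_memory_z",
    distance=5,
    rounds=5,
    after_clifford_depolarization=0.005,
)

# Sample detection events (the error syndrome) and the true logical outcomes.
sampler = circuit.compile_detector_sampler()
shots = 20_000
detection_events, observable_flips = sampler.sample(shots, separate_observables=True)

# Build a matching decoder from the circuit's detector error model.
dem = circuit.detector_error_model(decompose_errors=True)
matching = pymatching.Matching.from_detector_error_model(dem)

# Decode: predict which logical observables were flipped by the noise.
predictions = matching.decode_batch(detection_events)

# The logical error rate is the fraction of shots the decoder got wrong.
num_errors = np.sum(np.any(predictions != observable_flips, axis=1))
print(f"Logical error rate: {num_errors / shots:.4f}")
```

In belief-matching, this matching step is preceded by a round of belief propagation that reweights the matching graph using the circuit-level noise model, recovering accuracy that plain matching gives up by ignoring correlations in the noise.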
In the paper, we introduce quantum decoders that are both fast and accurate, and that work with a wide class of quantum error correction codes. This new set of decoders lowers the bar for qubit quality: it makes the qubits in a quantum computer look better by making them appear to produce fewer errors.
After the preprint of our paper was posted on arXiv, the Google Quantum AI team used our belief-matching decoder in their landmark paper demonstrating quantum error correction. Belief-matching was an important component in Google’s experiment, as it was the only efficient decoder capable of suppressing errors in their device.
This was an amazing project to work on. It began during my time at AWS and finished after I started at Riverlane.
You can read the full paper, ‘Improved decoding of circuit noise and fragile boundaries of tailored surface codes’, here.