Marco Ghibaudi, Riverlane
EE Times 19 Mar 2025
Quantum computing has had a turbulent few months. Google’s Willow chip sent stock markets surging, and then Jensen Huang claimed that useful quantum computing is 15 to 30 years away, a comment he has since walked back.
As with most things, the situation is more nuanced than it seems. A recent report explains how quantum “usefulness” is an incremental process: while Huang could be correct that the full economic impact of quantum will take around 20 years, we will start to see quantum computers surpass classical ones in the next few years, in line with subsequent comments from Bill Gates.
Yet, despite significant advancements and press attention, quantum computing remains hampered by a fundamental challenge: errors. It is only once we correct these errors that we can start to unlock the power of quantum computing.
The error problem
Qubits (quantum bits) are the building blocks of quantum computers, but they are highly sensitive to noise. The slightest environmental disturbance can corrupt their state, making useful quantum calculations impossible.
Today’s best quantum computers suffer from high error rates, on the order of one error in every few hundred operations.
For context, classical computers boast error rates far below one in a trillion. This disparity is not merely a minor inconvenience; it renders many ambitious quantum algorithms completely infeasible. Improving the quality of individual qubits, while crucial, will not bridge this gap on its own.
The error rate needs to decrease to one in a million to unlock even basic applications, and a reduction to one in a trillion is required to access the transformative potential of quantum computing.
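To put rough numbers on this, the sketch below uses the widely quoted surface-code scaling heuristic to estimate how big a code must be to reach those targets. Every constant in it is an assumption chosen for illustration (a physical error rate of one in a thousand, a threshold of 1%, a prefactor of 0.1 and a qubit count of roughly 2d²), not a figure from any particular machine.

# Rough sketch: how large a surface code might need to be to hit a target
# logical error rate, using the common heuristic
#     p_logical ~ 0.1 * (p / p_th) ** ((d + 1) / 2)
# All constants are illustrative assumptions, not measured hardware data.

def logical_error_rate(p: float, d: int, p_th: float = 1e-2) -> float:
    return 0.1 * (p / p_th) ** ((d + 1) / 2)

def distance_for_target(p: float, target: float) -> int:
    d = 3
    while logical_error_rate(p, d) > target:
        d += 2                      # surface-code distances are odd
    return d

for target in (1e-6, 1e-12):
    d = distance_for_target(p=1e-3, target=target)
    physical = 2 * d * d            # roughly 2d^2 data + ancilla qubits
    print(f"target {target:g}: distance {d}, ~{physical} physical qubits per logical qubit")

Under these assumptions, one-in-a-million logical errors needs a distance-9 code (a few hundred physical qubits per logical qubit), and one-in-a-trillion needs distance 21 (the better part of a thousand).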
This is where quantum error correction (QEC) comes in. But it is no easy task—QEC is arguably quantum computing’s defining challenge.
What is QEC?
QEC employs many techniques to protect the information stored in qubits from errors and decoherence. It leverages redundancy—encoding information across multiple physical qubits—to create a single, more robust “logical qubit.” If one physical qubit experiences an error, the others can compensate, ensuring the integrity of the encoded information. This process is far from straightforward.
The inherent challenge lies in the nature of quantum mechanics itself. Unlike classical bits, which can be directly measured without disturbing their state, measuring a qubit collapses its wave function. This means direct error detection, which is at the core of classical error correction techniques, is not possible.
QEC cleverly circumvents this limitation by measuring collective properties of groups of qubits, extracting information about the presence of errors without revealing the actual encoded data. Special “QEC codes” define these measurements and the subsequent error correction strategy.
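As a deliberately simplified illustration, the toy simulation below captures the classical skeleton of the three-qubit bit-flip repetition code. The two parity checks play the role of the collective measurements described above: they flag which qubit flipped without ever reading out the encoded logical value. (A real code must also handle phase errors and genuinely quantum states; this sketch shows only the syndrome idea.)

# Toy classical simulation of the 3-qubit bit-flip repetition code.
# Parity checks (the "collective measurements") detect a flipped qubit
# without revealing the encoded logical value.

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]             # logical 0 -> 000, logical 1 -> 111

def syndrome(q: list[int]) -> tuple[int, int]:
    return (q[0] ^ q[1], q[1] ^ q[2])  # compare neighbouring qubits

code = encode(1)
code[1] ^= 1                           # inject a bit-flip on the middle qubit
print(syndrome(code))                  # (1, 1): both checks fire

The syndrome (1, 1) is the same whether the encoded bit was 0 or 1, so the measurement has located the error while disclosing nothing about the data.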
Given a set of measurements indicating the presence of errors, a quantum decoder must identify the most likely error pattern and correct it.
This decoding problem is computationally intensive, even for relatively small systems. As the number of qubits scales, the decoding task quickly becomes massively complex, demanding a specialized approach that goes beyond general-purpose processors.
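Continuing the toy example, the most naive decoder is a lookup table from each syndrome to the most likely correction:

# Minimal lookup-table decoder for the 3-qubit repetition code.

def syndrome(q: list[int]) -> tuple[int, int]:
    return (q[0] ^ q[1], q[1] ^ q[2])  # the two parity checks

DECODE = {
    (0, 0): None,   # no error detected
    (1, 0): 0,      # only the first check fired  -> qubit 0 flipped
    (1, 1): 1,      # both checks fired           -> qubit 1 flipped
    (0, 1): 2,      # only the second check fired -> qubit 2 flipped
}

def correct(q: list[int]) -> list[int]:
    flip = DECODE[syndrome(q)]
    if flip is not None:
        q[flip] ^= 1                   # apply the most likely correction
    return q

print(correct([1, 0, 1]))              # -> [1, 1, 1]: logical value recovered

A code with n parity checks has 2^n possible syndromes, so tables like this blow up exponentially; practical decoders must compute corrections on the fly, fast enough to keep pace with the quantum hardware.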
The complexity of scaling
As quantum computers scale, an additional layer of classical QEC solutions, including a decoder, is required to tackle the growing number and variety of errors. Building it raises three broad challenges:
- Code development: Designing sophisticated QEC codes capable of detecting and correcting various error types demands immense mathematical and computational resources. The codes must be robust against different error mechanisms, and the resulting algorithms must be efficient enough for practical implementation.
- Speed requirements: QEC decoding must occur with remarkable speed; delays can lead to uncorrectable errors. Real-time decoding is crucial for each logical operation, implying extremely high bandwidth requirements, on the order of 100 TB/s (a back-of-envelope sense of how such figures arise follows this list). In simple terms, that is equivalent to a single quantum computer processing and correcting Netflix’s total global streaming data every second.
- Data volume handling: The volume of data involved in QEC decoding is substantial and grows rapidly with the number of qubits. Processing this deluge in real time introduces significant hardware and algorithmic hurdles.
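To see how bandwidth figures of that order arise, here is a back-of-envelope sketch in which every number is an assumption chosen for illustration, not a specification:

# Back-of-envelope syndrome-bandwidth estimate. All three factors below
# are illustrative assumptions, not measured or published specifications.

qubits = 1e6           # physical qubits in a large fault-tolerant machine
rounds_per_sec = 1e6   # one syndrome-extraction round per microsecond
bits_per_meas = 32     # raw (pre-discrimination) readout data per measurement

bits_per_sec = qubits * rounds_per_sec * bits_per_meas
print(f"{bits_per_sec / 8e12:.0f} TB/s")   # -> 4 TB/s for these assumptions

Scaling any of the three factors by 10x to 100x (more qubits, faster cycles, richer raw readout) pushes the total toward the 100-TB/s order of magnitude quoted above.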
Error suppression and mitigation
While QEC is critical, it is not the only approach to handling errors in quantum computers. Quantum error suppression (QES) and quantum error mitigation (QEM) are alternative strategies.
QES focuses on improving the quality of individual qubits, aiming to minimize the occurrence of errors. QEM employs classical techniques to reduce the impact of errors on the computation.
While both approaches have their place, neither offers the exponential error suppression that QEC provides. QEM suffers from a classical computational cost that scales exponentially with the number of qubits, making it impractical for large-scale systems.
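To make that scaling concrete: in mitigation schemes such as probabilistic error cancellation, each noisy operation multiplies the number of circuit repetitions required by a factor slightly above one, so the total sampling overhead grows exponentially with circuit size. The toy calculation below uses assumed, illustrative numbers:

# Toy illustration of why QEM stops scaling: probabilistic error
# cancellation needs roughly gamma**(2 * gates) extra samples, with
# gamma > 1 per noisy gate. The numbers below are assumptions.

p = 1e-3               # assumed per-gate error rate
gamma = 1 + 2 * p      # rough per-gate sampling-cost factor

for gates in (10**3, 10**4, 10**5):
    overhead = gamma ** (2 * gates)
    print(f"{gates:>7} gates -> ~{overhead:.2e}x more shots")

At a thousand gates the overhead is a modest ~50x; at a hundred thousand gates it exceeds 10^170, which is why mitigation alone cannot reach the circuit sizes that demand one-in-a-trillion error rates.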
The path forward
The current landscape of quantum computing clearly shows the importance of QEC. A significant majority of leading quantum hardware companies are actively investing in QEC research and development, recognizing that it is central to achieving fault-tolerant quantum computing.
This widespread investment marks a critical shift away from simply adding more physical qubits and toward developing the high-fidelity logical qubits that only QEC can deliver.
In conclusion, QEC is challenging but vital to building quantum computers capable of outperforming classical computers and realizing the technology’s revolutionary possibilities.
The sooner we can crack QEC, the sooner we will see useful quantum computing.