Guest post by Yuval Boger, Chief Commercial Officer of QuEra Computing
Quantum Error Correction (QEC) is a critical area of quantum computing that aims to protect quantum information from the errors that inevitably occur during computation. Quantum systems are highly susceptible to noise and decoherence, so effective QEC is essential for achieving scalable and reliable quantum computers. In 2024 we witnessed a major shift in focus, from counting physical qubits to implementing and improving logical qubits. What was achieved in 2024, and what can we expect in 2025?
Historical context
In the 1990s, pioneers such as Peter Shor, Andrew Steane, and Daniel Gottesman laid the foundations for quantum error correction by adapting classical error-correction principles to the quantum realm. Introduced in 1995, Shor's groundbreaking code demonstrated how to encode a single logical qubit across multiple physical qubits, making it possible to correct bit-flip and phase-flip errors without collapsing the quantum state.
At the heart of QEC is the challenge of achieving fault tolerance: keeping quantum computation reliable even when any component of the system can fail. Fault tolerance relies on the threshold theorem, which states that errors can be effectively corrected as long as physical error rates fall below a certain threshold. This principle underpins the continued development of QEC strategies designed to detect and correct errors faster than they accumulate.
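The consequence of the threshold theorem can be seen in a back-of-the-envelope sketch. The model below uses the common scaling form for the logical error rate of a distance-d code; the constants A and p_th are illustrative placeholders, not values for any particular code:

```python
# Illustrative threshold-theorem scaling: for a code of distance d,
# the logical error rate behaves roughly as
#   p_L ~ A * (p / p_th)^((d+1)//2)
# where p is the physical error rate and p_th the code threshold.
# A and p_th here are placeholders, not measured values.

def logical_error_rate(p, d, p_th=0.01, A=0.1):
    return A * (p / p_th) ** ((d + 1) // 2)

# Below threshold (p < p_th): increasing distance suppresses errors.
below = [logical_error_rate(0.001, d) for d in (3, 5, 7)]
# Above threshold (p > p_th): increasing distance makes things worse.
above = [logical_error_rate(0.02, d) for d in (3, 5, 7)]

assert below[0] > below[1] > below[2]  # exponential suppression
assert above[0] < above[1] < above[2]  # errors accumulate faster than corrected
```

The same formula captures both regimes: the ratio p/p_th determines whether growing the code helps or hurts, which is why crossing the threshold is such a pivotal milestone.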
Important parameters for comparing different QEC codes
Various QEC codes have unique properties that affect their effectiveness and practicality across different quantum hardware architectures. The key parameters used to compare them are:
Code distance (d): Represents the robustness of a quantum error correction code by measuring the minimum number of physical qubit errors required to corrupt a logical qubit. For example, a code with d=3 can correct a single error, and d=7 can correct three errors, while d=2 can detect a single error but cannot correct it.
Qubit overhead: The number of physical qubits required to encode a single logical qubit.
Connectivity requirements: Whether the code requires only local (nearest-neighbor) interactions or long-range interactions between qubits, as well as the degree of connectivity (number of connections) each qubit must maintain.
Threshold error rate: The physical error rate below which logical errors are exponentially suppressed by increasing the code distance.
Accessible logical gates: The ease of implementing fault-tolerant logical gates (including non-Clifford gates).
Measurement and decoding complexity: The number of syndrome measurements and the amount of classical computation required to decode errors in real time.
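As a small illustration of how distance and overhead interact, the sketch below computes correctable errors from the code distance and the physical-qubit count of a standard rotated surface code (the 2d² − 1 figure is specific to that layout; other codes differ):

```python
# A distance-d code corrects floor((d-1)/2) errors and detects d-1.
def correctable_errors(d):
    return (d - 1) // 2

def detectable_errors(d):
    return d - 1

# Standard rotated surface code layout: d*d data qubits plus d*d - 1
# ancilla qubits for syndrome measurement, i.e. 2d^2 - 1 physical
# qubits per logical qubit.
def surface_code_overhead(d):
    return 2 * d * d - 1

for d in (3, 5, 7):
    print(d, correctable_errors(d), surface_code_overhead(d))
# d=3 corrects 1 error using 17 physical qubits;
# d=7 corrects 3 errors using 97 physical qubits.
```

This makes the overhead trade-off concrete: correction power grows linearly with d while the surface-code qubit cost grows quadratically, which is exactly why lower-overhead codes are an active research direction.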
2024: The Dawn of the Logical Qubit Era
Several commercial and academic groups have shown impressive results with quantum error correction. Key publications include:
Advances in surface codes and logical qubit manipulation
A Harvard-led publication, “Logical quantum processor based on reconfigurable atom arrays” (here), sharpened the field’s focus on logical qubits. It demonstrated improved two-qubit logical gates by scaling the surface code distance from d=3 to d=7, fault-tolerant preparation of logical GHZ states, and a complex sampling circuit running on up to 48 logical qubits. The study highlighted capabilities unique to neutral-atom computers, such as reconfigurable connectivity and parallel multi-qubit operations.
Google published “Quantum error correction below the surface code threshold” (here), describing its new superconducting Willow chip. Google demonstrated below-threshold error correction on Willow: the physical error rate is low enough that adding qubits to the code reduces the logical error rate rather than amplifying it. This is an important milestone for QEC.
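The hallmark of below-threshold operation is an error-suppression factor Λ > 1: each increase of the code distance by two divides the logical error rate by Λ. A minimal sketch of estimating Λ from logical error rates measured at successive distances (the rates below are hypothetical illustrative values, not Google's published data):

```python
# Estimate the error-suppression factor Lambda from logical error
# rates measured at distances d and d+2:
#   eps_{d+2} = eps_d / Lambda  =>  Lambda = eps_d / eps_{d+2}
def suppression_factor(eps_d, eps_d_plus_2):
    return eps_d / eps_d_plus_2

# Hypothetical measured logical error rates per code distance.
eps = {3: 4.0e-3, 5: 2.0e-3, 7: 1.0e-3}

lam_3_to_5 = suppression_factor(eps[3], eps[5])  # 2.0
lam_5_to_7 = suppression_factor(eps[5], eps[7])  # 2.0

# Lambda > 1 at every step is the signature of below-threshold operation.
assert lam_3_to_5 > 1 and lam_5_to_7 > 1
```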
Fault tolerance innovation in various qubit modalities
IBM published “High-threshold and low-overhead fault-tolerant quantum memory” (here), presenting the “gross code,” a quantum low-density parity-check (qLDPC) code for superconducting qubits that encodes 12 logical qubits into 144 physical qubits, assuming a physical error rate of 0.1%.
Microsoft and Quantinuum released “Demonstration of quantum computation and error correction with a tesseract code” (here) on Quantinuum’s 56-qubit trapped-ion computer. They demonstrated 12 logical qubits with an error rate of 0.0011, 22 times better than the corresponding physical circuit error rate of 0.024.
Separately, Microsoft and Atom Computing published “Logical computation demonstrated with a neutral atom quantum processor” (here), demonstrating 24 and 28 logical qubits on neutral-atom computers.
AWS and collaborators published “Hardware-efficient quantum error correction using concatenated bosonic qubits” (here).
Algorithmic fault tolerance and magic state distillation
A QuEra-led paper, “Algorithmic Fault Tolerance for Fast Quantum Computing” (here), presents a new fault-tolerance strategy, transversal algorithmic fault tolerance, built on the Harvard-led paper “Correlated decoding of logical algorithms with transversal gates” (here). In contrast to traditional QEC methods, which require repeated rounds of syndrome extraction for each logical operation and are therefore significantly slower than physical clock speeds, this approach brings logical clock speeds much closer to physical ones (often around 30 times faster in the relevant regime).
A QuEra-led team published “Experimental Demonstration of Logical Magic State Distillation” (here), achieving a key milestone for large-scale quantum computing: logical magic state distillation (MSD). Magic state distillation is a foundational building block of large-scale quantum computers. Stabilizer states and Clifford operations are relatively easy to implement on error-corrected quantum computers; however, such operations can be simulated efficiently on classical computers and are not sufficient for universal quantum computation. This is where magic states come in. “Magic” quantifies how far a quantum state is from the set of stabilizer states. Magic state distillation prepares high-fidelity magic resource states by purifying multiple low-fidelity ones. This work demonstrated logical-level MSD using 2D color codes with d=3 and d=5.
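To see why distillation works, consider the canonical 15-to-1 protocol, the standard textbook example (the QuEra experiment used a different, color-code-based scheme): it consumes 15 noisy magic states with error rate p and outputs one state whose error is, to leading order, about 35p³, so each round suppresses errors cubically.

```python
# Canonical 15-to-1 magic state distillation: output error ~ 35 * p^3
# for input error p (leading-order approximation). Illustration only;
# the experiment described above used a color-code-based scheme.
def distill_15_to_1(p):
    return 35 * p ** 3

p = 0.01  # 1% error on raw magic states
for round_number in range(2):
    p = distill_15_to_1(p)
    print(round_number + 1, p)
# Round 1 brings the error to ~3.5e-5; round 2 to ~1.5e-12.
```

The cubic suppression per round is what makes high-fidelity magic states reachable from modest raw fidelities, at the cost of consuming many input states per output.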
Specialized techniques and high-fidelity operations
Quantinuum published several other notable papers, including “High-fidelity teleportation of a logical qubit using transversal gates and lattice surgery” (here) and a benchmark of the logical three-qubit quantum Fourier transform encoded in the Steane code on a trapped-ion quantum computer (here).
Beyond quantum computer vendors
QEC is more than just a quantum challenge: it relies heavily on high-performance classical computing to detect, decode, and correct errors in real time.
One approach is to use FPGAs or ASICs; see, for example, Yale’s “FPGA-based distributed union-find decoder for surface codes” (here) or “Demonstrating real-time and low-latency quantum error correction” from Riverlane and co-authors (here).
Alternatively, some believe that GPUs are more appropriate at this stage because they offer high-performance parallelism and flexibility.
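At the core of the union-find decoder mentioned above is the classic disjoint-set data structure, which lets the decoder merge clusters of syndrome defects in near-linear time. A minimal sketch of that underlying structure (real decoders add cluster growth, boundary handling, and a peeling step on top):

```python
# Minimal disjoint-set (union-find) structure with path compression
# and union by size -- the classical core that union-find decoders
# build on to merge clusters of syndrome defects.
class DisjointSet:
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        # Walk to the root, halving the path as we go (path compression).
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:  # union by size
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

# Merging defect pairs (0,1) and (1,2) puts all three in one cluster:
ds = DisjointSet(5)
ds.union(0, 1)
ds.union(1, 2)
assert ds.find(0) == ds.find(2)
assert ds.find(3) != ds.find(0)
```

Because each find/union is nearly constant time, the decoder's classical workload scales gracefully with the number of defects, which is what makes FPGA and ASIC implementations attractive for real-time operation.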
New directions and strategies
Despite these advances, the field faces major challenges, particularly the large number of physical qubits required to represent each logical qubit. Furthermore, achieving target logical error rates, often cited as one in a million to enable practical quantum applications, remains a critical hurdle. Looking ahead, several directions will be explored in 2025:
Exploring logical algorithms: Rather than focusing solely on code construction, researchers are now implementing logical algorithms on real hardware. This yields empirical insights, such as improving decoders, refining fault-tolerant gate designs, and exploiting noise characteristics. Initial results suggest that gates performed transversally, especially on reconfigurable platforms such as neutral atoms, can significantly enhance algorithm performance.
More codes: Surface codes remain the mainstay thanks to their high threshold (~1%), but alternatives such as color codes and high-rate qLDPC codes are gaining traction through potentially lower overhead or simpler logical gate implementations. As hardware diversity expands (superconducting qubits, trapped ions, neutral atoms, photonics), different codes may find specialized niches.
Noise tailoring: Not all errors are created equal. Practical QEC can benefit from modeling specific error channels, particularly biased noise, erasure, or photon loss, to tailor correction strategies. Hardware that tracks these distinct noise processes can feed that information back to the decoder, increasing logical fidelity.
Machine learning in QEC: For a comprehensive review of AI for QEC, see “Artificial Intelligence for Quantum Error Correction: A Comprehensive Review” (here). Machine learning techniques are being used to accelerate decoding algorithms, optimize stabilizer measurements, and adapt QEC strategies in real time. This could prove invaluable for managing large qubit arrays where manual tuning is unrealistic.
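Putting rough numbers on the scaling challenge mentioned above: under the simple threshold-theorem scaling model, one can estimate the code distance, and hence the qubit overhead, needed to reach a one-in-a-million logical error rate. All constants below (A, p_th, and the physical error rate) are illustrative placeholders:

```python
# Estimate the surface code distance needed to hit a target logical
# error rate under the simple model p_L ~ A * (p/p_th)^((d+1)//2).
# A, p_th, and p are illustrative placeholders, not measured values.
def required_distance(p, target, p_th=0.01, A=0.1):
    d = 3
    while A * (p / p_th) ** ((d + 1) // 2) > target:
        d += 2
    return d

p = 0.002                        # illustrative physical error rate
d = required_distance(p, 1e-6)   # one-in-a-million target
physical_qubits = 2 * d * d - 1  # rotated surface-code layout
print(d, physical_qubits)        # -> 15 449
```

Even in this optimistic toy model, a single logical qubit costs hundreds of physical qubits, which is why lower-overhead codes and better decoders dominate the 2025 agenda.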
The new era of QEC
QEC, once considered a distant challenge, is now one of the most active frontiers in the field. The transition from experimental demonstrations to truly scalable, fault-tolerant quantum computation seems increasingly feasible. As logical qubit counts and quality improve, we move closer to the goal of applying quantum technology to real-world applications.
Acknowledgements: Harry Zhou, Quantum Error Correction Architecture Lead at QuEra Computing, provided very helpful comments and suggestions on this article.