Scientists at the Institute of Science Tokyo have announced a breakthrough in quantum error correction that could bring a large-scale quantum computer closer to reality.
The team has developed a new class of quantum low-density parity-check (LDPC) error correction codes that perform close to the hashing bound, the theoretical efficiency limit for quantum error correction.
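For the depolarizing channel, where X, Y, and Z errors each strike with probability p/3, the hashing bound takes the standard form R = 1 - H2(p) - p*log2(3), with H2 the binary entropy. A short sketch (not the team's code) shows the achievable rates this bound implies, and why a code rate above 1/2 is compatible with physical error rates of a few percent:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def hashing_bound(p: float) -> float:
    """Achievable rate R = 1 - H2(p) - p*log2(3) for a depolarizing
    channel that applies X, Y, or Z each with probability p/3."""
    return 1.0 - binary_entropy(p) - p * math.log2(3)

# p = 0.01 -> 0.903, p = 0.05 -> 0.634, p = 0.10 -> 0.373
for p in (0.01, 0.05, 0.10):
    print(f"p = {p:.2f}: hashing bound = {hashing_bound(p):.3f}")
```

Codes that "perform close to the hashing bound" thus waste little of the rate that the channel permits in principle.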
“Our quantum error-correcting code has a greater than 1/2 code rate, targeting hundreds of thousands of logical qubits,” explained Kenta Kasai, Associate Professor at the Institute of Science Tokyo, who led the project.
“Moreover, its decoding complexity is proportional to the number of physical qubits, which is a significant achievement for quantum scalability,” he added.
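Kasai's linear-complexity claim echoes a general property of LDPC decoding: iterative decoders do work proportional to the number of nonzero entries in the sparse parity-check matrix, which grows linearly with block length when row and column weights are bounded. A classical bit-flipping decoder, a simple relative of the belief-propagation decoders used with LDPC codes, makes this concrete. The tiny matrix H and the decoder below are illustrative assumptions, not the team's construction; a real implementation would store H sparsely so each iteration touches only its nonzeros.

```python
# Illustrative 3x6 parity-check matrix (dense here for readability).
H = [
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
]

def bit_flip_decode(word, max_iters=10):
    """Gallager-style bit flipping: repeatedly flip the bit that sits
    in the most unsatisfied parity checks until the syndrome clears."""
    word = list(word)
    for _ in range(max_iters):
        syndrome = [sum(h * w for h, w in zip(row, word)) % 2 for row in H]
        if not any(syndrome):
            return word  # all checks satisfied
        votes = [sum(s for s, row in zip(syndrome, H) if row[j])
                 for j in range(len(word))]
        word[votes.index(max(votes))] ^= 1
    return word

# A single bit error on the all-zeros codeword is corrected:
print(bit_flip_decode([1, 0, 0, 0, 0, 0]))  # -> [0, 0, 0, 0, 0, 0]
```

Because each iteration's cost tracks the number of edges in the code's Tanner graph, total decoding work scales with the number of physical qubits rather than exploding with code size.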
Quantum computers have long promised to revolutionize fields like quantum chemistry, cryptography, and large-scale optimization. However, progress has been hampered by the fragile nature of qubits.
Qubits have short coherence times, losing their state quickly, and operations such as gates and measurements introduce errors at high rates. Current quantum error correction methods can require thousands of physical qubits to encode just one logical qubit.
The new LDPC codes are designed to scale to hundreds of thousands of logical qubits. They have a high code rate, meaning fewer physical qubits are needed to encode each logical qubit.
This combination of efficiency and scalability could make millions of logical qubits possible, marking a key step towards solving real-world problems.
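The overhead arithmetic is simple: a code that encodes k logical qubits into n physical qubits has rate R = k/n, so a target of k logical qubits needs n = k/R physical qubits. The sketch below compares a rate-1/2 code against an assumed 1-in-1000 overhead standing in for the "thousands of physical qubits per logical qubit" figure; the numbers are illustrative, not from the announcement.

```python
import math

def physical_qubits_needed(logical_qubits: int, code_rate: float) -> int:
    """Physical qubits n = k / R needed to encode k logical qubits
    at code rate R = k / n (rounded up)."""
    return math.ceil(logical_qubits / code_rate)

k = 100_000  # hundreds of thousands of logical qubits targeted
print(physical_qubits_needed(k, 0.5))    # rate 1/2 -> 200000 physical qubits
print(physical_qubits_needed(k, 0.001))  # assumed 1/1000 -> 100000000
```

At rate 1/2 the physical budget stays within the same order of magnitude as the logical target, which is what makes millions of logical qubits look reachable.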