Error-correction feat shows quantum computers will get more accurate as they grow larger.
Researchers at Google have built a chip that has enabled them to demonstrate the first ‘below threshold’ quantum calculations — a key milestone in the quest to build quantum computers that are accurate enough to be useful.
The experiment, described on 9 December in Nature¹, shows that with the right error-correction techniques, quantum computers can perform calculations with increasing accuracy as they are scaled up — possible because the error rate of the underlying hardware has been pushed below a crucial threshold. Current quantum computers are too small and too error-prone for most commercial or scientific applications.
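In the theory of the surface code that Google uses, 'below threshold' has a standard quantitative meaning. A hedged sketch of the textbook scaling relation (conventional symbols, not figures from the new paper):

```latex
% Textbook surface-code scaling (illustrative, not numbers from the paper):
%   \varepsilon_d    logical error rate at code distance d
%   p                physical error rate of the hardware
%   p_{\mathrm{th}}  the code's threshold error rate
\varepsilon_d \;\approx\; C \left(\frac{p}{p_{\mathrm{th}}}\right)^{\lfloor (d+1)/2 \rfloor}
```

When p sits below p_th, the bracketed ratio is smaller than one, so each increase in code distance d (which costs more physical qubits) multiplies the logical error rate by a factor below one; above the threshold, the same growth amplifies errors instead.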
“This has been a goal for 30 years,” said Michael Newman, a research scientist at Google’s headquarters in Mountain View, California, at a press conference announcing the feat. The achievement means that by the end of the decade, quantum computers could enable scientific discoveries that are impossible even with the most powerful supercomputers imaginable, said Charina Chou, the chief operating officer of Google’s quantum-computing arm. “That’s the reason we’re building these things in the first place,” Newman added.
“This work shows a truly remarkable technological breakthrough,” says Chao-Yang Lu, a quantum physicist at the University of Science and Technology of China in Shanghai.
Delicate states
Quantum computers encode information in states that can represent a 0 or a 1 — like the bits of ordinary computers — but can also use infinite possible combinations of several 0s and 1s. However, these quantum-information states are notoriously delicate, explains Julian Kelly, a physicist at Google who leads the company’s quantum-hardware division. To get a quantum computer to perform useful calculations, “you need quantum information, and you need to protect it from the environment — and from ourselves, as we do manipulations on it”, he says.
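A rough way to picture those combinations: the state of a handful of qubits is a list of complex amplitudes, one per possible bit pattern, with only the total probability fixed. The NumPy sketch below is a plain illustration of that idea, not code tied to Google's hardware.

```python
import numpy as np

# A single qubit is a normalised pair of complex amplitudes (alpha, beta):
# any values with |alpha|^2 + |beta|^2 = 1 describe a valid state, which is
# why there are infinitely many combinations rather than just 0 or 1.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
qubit = np.array([alpha, beta])

# Two qubits together occupy a four-dimensional space of amplitudes:
# the joint state is the tensor (Kronecker) product of the individual states.
other = np.array([np.sqrt(0.8), np.sqrt(0.2)])
pair = np.kron(qubit, other)

print(np.abs(pair) ** 2)                      # probabilities of 00, 01, 10, 11
print(np.isclose(np.linalg.norm(pair), 1.0))  # True: the state stays normalised
```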
Aiming for such protection — without which quantum computing would be a non-starter — theoreticians began in 1995 to develop ingenious schemes for spreading one qubit of information across multiple ‘physical’ qubits. The resulting ‘logical qubit’ is resilient to noise — at least on paper. For this technique, called quantum error correction, to work in practice, it would be necessary to show that this spreading of information over multiple qubits robustly lowered error rates.
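The simplest toy model of this spreading of information is a classical repetition code decoded by majority vote. Willow uses the far more sophisticated surface code, so the Python sketch below illustrates only the threshold principle, not the paper's method.

```python
import random

def logical_error_rate(p, n_copies=3, trials=100_000):
    """Monte Carlo estimate of the logical error rate of a toy repetition code.

    Each of the n_copies bits is flipped independently with probability p;
    a majority vote then recovers the encoded value. A logical error occurs
    only when most of the copies have been flipped.
    """
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n_copies))
        if flips > n_copies // 2:
            errors += 1
    return errors / trials

# Below this toy code's threshold (p = 0.5), spreading the information over
# more copies suppresses the logical error rate; above it, the extra
# redundancy makes things worse.
for p in (0.05, 0.6):
    for n in (1, 3, 5, 7):
        print(f"p={p:.2f}, copies={n}: logical error ~ {logical_error_rate(p, n):.4f}")
```

Running it shows the pattern the Google experiment now demonstrates on real quantum hardware: below the threshold, every added layer of redundancy buys a further drop in the error rate.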
Over the past few years, several companies — including IBM and Amazon's AWS — and academic groups have shown that error correction can produce small improvements in accuracy²,³,⁴. Google published a result in early 2023 using 49 qubits of its Sycamore quantum processor, which encodes each physical qubit in a superconducting circuit.
The company’s new chip, called Willow, is a larger, improved version of that technology, with 105 physical qubits. It was developed in a fabrication laboratory that Google built at its quantum-computing campus in Santa Barbara, California, in 2021.
As a first demonstration of Willow’s power, the researchers showed that it could perform, in roughly 5 minutes, a task that would take the world’s largest supercomputer an estimated 10²⁵ years, says Hartmut Neven, who heads Google’s quantum-computing division. This is the latest salvo in the race to show that quantum computers have an advantage over classical ones.
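Taking the quoted figures at face value, the implied gap on this particular benchmark works out to roughly thirty orders of magnitude:

```latex
\frac{10^{25}\,\text{years} \times 3.15\times 10^{7}\,\text{s yr}^{-1}}
     {5\,\text{min} \times 60\,\text{s min}^{-1}}
\;\approx\; 10^{30}
```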
Source: nature.com