Quantum computers have been under development for decades. While ordinary, or classical, computers perform calculations using sequences of bits composed of 1s and 0s, quantum computers encode information using quantum bits, or qubits, that behave according to the strange rules of quantum mechanics. Quantum computers aim to harness those features to rapidly perform calculations far beyond the capacity of any ordinary computer. But for years, quantum computers struggled to match the computing power of a handheld calculator.
In 2012, John Preskill, a theoretical physicist at the California Institute of Technology, coined the phrase quantum supremacy to describe the moment when a quantum computer finally surpasses even the best supercomputer. The term caught on, but experts came to hold different ideas about what it means.
And that’s how you end up in a situation where Google says it has achieved quantum supremacy but IBM says it hasn’t.
Before explaining what quantum supremacy means, it’s worth clarifying what it doesn’t mean: the moment a quantum computer performs a calculation that’s impossible for a classical computer. This is because a classical computer can, in fact, perform any calculation a quantum computer can perform—eventually.
“Given enough time … classical computers and quantum computers can solve the same problems,” says Thomas Wong of Creighton University.
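That in-principle equivalence can be made concrete: a classical program can simulate a quantum computer by tracking all of its amplitudes, it just needs an amount of memory that doubles with every added qubit. The sketch below (my illustration, not drawn from the article; the gate math is standard) simulates three qubits with NumPy and applies a Hadamard gate to each, which is possible, but quickly impractical, on classical hardware.

```python
import numpy as np

# A classical simulation of n qubits tracks all 2**n complex amplitudes.
# Doable in principle for any n -- but memory doubles with each qubit.
n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0  # start in the all-zeros basis state |000>

# Hadamard gate: puts a single qubit into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Build the operator that applies H to every qubit (a 2**n x 2**n matrix).
U = H
for _ in range(n - 1):
    U = np.kron(U, H)

state = U @ state

# Each of the 8 outcomes is now equally likely: probability 1/8 apiece.
print(np.round(np.abs(state) ** 2, 3))
```

The catch is the scaling: at 50 qubits this approach needs 2^50 amplitudes, which is why simulating machines like Sycamore pushes even supercomputers such as Summit to their limits.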
Instead, most experts interpret quantum supremacy to mean the moment a quantum computer performs a calculation that, for all practical purposes, a classical computer cannot. This is the crux of the disagreement between Google and IBM, because “practical” is a fuzzy concept.
In its Nature paper, Google claims that its Sycamore processor took 200 seconds to perform a calculation that the world’s best supercomputer—which happens to be IBM’s Summit machine—would need 10,000 years to perform. That’s not a practical time frame. But IBM now argues that Summit, which fills an area the size of two basketball courts at the Oak Ridge National Laboratory, in Tennessee, could perform the calculation in two and a half days.
Google stands by its 10,000-year estimate, though several computer experts interviewed for this article said IBM is probably right on that point. “IBM’s claim looks plausible to me,” Scott Aaronson, a professor at the University of Texas at Austin, said in an email.
So assuming IBM is right, is two and a half days a practical amount of time? Maybe it is for some tasks, but certainly not for others. For that reason, when computer scientists talk about quantum supremacy, they usually have a more precise idea in mind.
Computer scientists distinguish between programs that run in “fast” polynomial time and “slow” exponential time. Fast programs stay fast even when you ask them to chew on a really big input. Slow programs bog down rapidly as the size of the problem you’re asking them to solve gets bigger.
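The gap between the two is easy to underestimate. A toy comparison (my illustration, with arbitrary example functions, not from the article): count the steps an algorithm whose work grows like n² takes versus one whose work doubles with each unit of input.

```python
def polynomial_steps(n: int) -> int:
    """A 'fast' algorithm: work grows like n squared."""
    return n ** 2

def exponential_steps(n: int) -> int:
    """A 'slow' algorithm: work doubles with every unit of input."""
    return 2 ** n

# At small sizes the two look comparable; by n=100 the exponential
# step count dwarfs the polynomial one by many orders of magnitude.
for n in (10, 30, 100):
    print(f"n={n:>3}  polynomial: {polynomial_steps(n):,}  "
          f"exponential: {exponential_steps(n):,}")
```

At n=10 the exponential algorithm needs about 1,000 steps to the polynomial one's 100; at n=100 it needs roughly 10³⁰ steps to the polynomial one's 10,000. That blowup is why a task a quantum computer finishes in minutes can take a classical simulation millennia.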