Quantum computers are still a nascent technology, but researchers are busy building complex machine learning algorithms to test the capabilities of quantum learning. Sometimes, however, their algorithms hit a mysterious dead end: a flat stretch of the mathematical landscape that offers no way forward or backward, the dreaded barren plateau.
“Understanding the problem of barren plateaus was regarded as the key to unlocking quantum machine learning, and our team has worked on this for five years,” said Marco Cerezo, the Los Alamos team’s lead scientist. “The major issue was that we knew these barren plateaus existed, but we didn't really know what unified all the sources of this phenomenon. But what we have done now is characterize, mathematically, why and when barren plateaus occur in variational quantum algorithms.”
Instead of the ones and zeros used by conventional computers, quantum computers use qubits, which take advantage of quantum phenomena such as superposition, the ability to exist in several states at the same time. It is believed that quantum computers, when paired with quantum algorithms, will help humanity solve certain types of problems that were previously too difficult, or took too long, for conventional computers.
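For readers who like to tinker, here is a minimal, purely illustrative sketch in Python with NumPy (not code from the Los Alamos team) showing what superposition means numerically: a single qubit in an equal superposition yields 0 or 1 with equal probability when measured.

```python
# Illustrative only: a qubit state |psi> = a|0> + b|1> stored as two complex amplitudes.
import numpy as np

# Equal superposition: the qubit is "in both states at once" until it is measured.
amplitudes = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Born rule: the probability of each outcome is the squared magnitude of its amplitude.
probabilities = np.abs(amplitudes) ** 2
print(probabilities)  # [0.5 0.5]

# Simulate ten measurements; both 0 and 1 show up, roughly half the time each.
print(np.random.choice([0, 1], size=10, p=probabilities))
```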
Barren plateaus were a little-understood but common problem in quantum algorithm development. Sometimes, after months of work, researchers would run their algorithm and it would unexpectedly fail. Scientists had developed theories as to why barren plateaus exist and had even adopted sets of practices to avoid them. But no one knew the underlying cause of this mathematical equivalent of a dead end.
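To see what a barren plateau looks like in practice, here is a small, illustrative sketch (again, not the Los Alamos team's code) written with the open-source PennyLane simulator, chosen here purely for demonstration. It estimates the variance of a cost-function gradient over randomly chosen circuit parameters; on a barren plateau, that variance shrinks dramatically as qubits are added, so an optimizer gets almost no signal telling it which way to go.

```python
# Illustrative only: estimate how the gradient of a randomly parameterized
# circuit behaves as the number of qubits grows (a barren-plateau signature).
import pennylane as qml
from pennylane import numpy as np

def make_circuit(n_qubits, n_layers):
    dev = qml.device("default.qubit", wires=n_qubits)

    @qml.qnode(dev)
    def circuit(params):
        # Layers of single-qubit rotations followed by entangling CNOT gates.
        for layer in range(n_layers):
            for w in range(n_qubits):
                qml.RY(params[layer, w], wires=w)
            for w in range(n_qubits - 1):
                qml.CNOT(wires=[w, w + 1])
        # A simple cost function: expectation value of Pauli-Z on the first qubit.
        return qml.expval(qml.PauliZ(0))

    return circuit

n_layers, n_samples = 10, 50
for n_qubits in (2, 4, 6):
    circuit = make_circuit(n_qubits, n_layers)
    grads = []
    for _ in range(n_samples):
        params = np.random.uniform(0, 2 * np.pi, size=(n_layers, n_qubits))
        # Partial derivative of the cost with respect to the first parameter.
        grads.append(qml.grad(circuit)(params)[0, 0])
    # On a barren plateau, this variance decays exponentially with qubit count.
    print(f"{n_qubits} qubits: gradient variance = {np.var(grads):.2e}")
```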
With a precise characterization of barren plateaus, scientists now have a set of guidelines to follow when creating new quantum algorithms. This will prove essential as quantum computers scale in power, from a maximum of 65 qubits three years ago to computers with more than 1,000 qubits in development today.
This breakthrough, a unified theory of barren plateaus, removes the guesswork from building quantum machine learning algorithms, guesswork that can otherwise waste time and resources. It also removes one of the largest challenges facing the field of quantum computing.