IBM Unlocks Quantum Utility With its 127-Qubit “Eagle” Quantum Processing Unit

A team of IBM researchers, working with the University of California, Berkeley and Purdue University, has successfully extracted useful quantum computing from one of today’s NISQ (Noisy Intermediate-Scale Quantum) computers. The team used Eagle, one of IBM’s newest quantum processing units (QPUs), to perform computations that were expected to fail amid qubit noise. But through a clever feedback mechanism between the 127-qubit Eagle QPU and supercomputers at UC Berkeley and Purdue, IBM managed to prove that useful results can be extracted from a noisy QPU. The door to quantum utility is open, and much earlier than expected.
NISQ-era quantum computers are tethered to standard supercomputers, the most powerful classical machines known to man, capable of trillions of operations per second. They are powerful, but it is a universal truth that when two subjects are roped together, they can only move as fast as the slower one allows. And the supercomputer was already stretched thin for this experiment, resorting to advanced techniques to accommodate the complexity of the simulation.
When the qubit simulations became too complex to simply brute-force on a supercomputer, the researchers at the University of California, Berkeley turned to compression algorithms known as tensor network states. These tensor networks (matrices) are essentially data cubes: the numbers that make up the computation are represented in three-dimensional space (x, y, z), allowing more complex relationships and quantities of information to be processed than typical 2D solutions permit. Consider a simple 2D Excel table (x, y), and how many more rows you would need to search in that configuration if you had to account for another plane of information (z).
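To make the compression idea concrete, here is a minimal, hypothetical Python sketch of the principle behind tensor network compression: decompose the data, keep only the largest components, and discard the rest. The matrix, the cutoff `chi`, and the numbers are all illustrative assumptions; the Berkeley team’s actual simulations used far more sophisticated tensor network methods.

```python
import numpy as np

# Illustrative only: truncate a matrix with a singular value decomposition,
# keeping the "chi" largest components. Tensor network states apply this
# same keep-the-biggest-pieces idea to the far larger tensors that
# describe a quantum state.
rng = np.random.default_rng(seed=42)
low_rank = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 64))
A = low_rank + 1e-6 * rng.standard_normal((64, 64))  # nearly compressible data

U, s, Vt = np.linalg.svd(A, full_matrices=False)

chi = 8  # how many components to keep (the "bond dimension" in tensor networks)
A_compressed = U[:, :chi] @ np.diag(s[:chi]) @ Vt[:chi, :]

error = np.linalg.norm(A - A_compressed) / np.linalg.norm(A)
print(f"kept {chi} of 64 components, relative error ~ {error:.1e}")
```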
“The crux of this work is that we can now run fairly large and deep circuits using all of Eagle’s 127 qubits, and get numerically correct results.”
Kristan Temme
This means there is already utility to be extracted from NISQ quantum computers: results that are out of reach of a standard supercomputer, at least in terms of time and money, where getting them classically would cost more than they are worth.
Currently, there is a back-and-forth between NISQ-era quantum computers, with (at most) hundreds of qubits, and the solutions provided by standard supercomputers with trillions of transistors. As the number of useful qubits increases, circuits even deeper than those used in the paper will be explored. And as qubit counts and quality rise, standard supercomputers must keep pace, crunching numbers and verifying the queue of quantum computing results as deeply as they can.
“It immediately points to the need for new classical techniques,” Anand said. And they are already considering those methods: “We are currently applying the same error mitigation concept to classical tensor network simulations to see if we can get better classical results.”
Essentially, the more accurately we can predict how noise evolves in a quantum system, the better we know how that noise can compromise correct results. And the way to learn to predict something is simply to poke it and watch what happens, so you can identify the levers that move it.
Some of these levers have to do with when and how the qubits are activated: some circuits use more qubits than others, arrange them in different sequences of quantum gates, or require more complex entanglement between specific qubits. IBM’s researchers had to learn, by turning each of these knobs across Eagle’s 127 qubits, exactly what noise appears and in what amount. Because if you know how to introduce noise, you can control it. If you understand how it manifests in the first place, you can account for it, and thus suppress it or take advantage of it.
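As an illustration of this “turn the knobs and watch” idea, one common way to deliberately dial noise up is unitary folding, sketched below in toy Python. This is an assumption for illustration only: IBM’s work amplified a carefully learned noise model rather than simply folding gates, and the `fold_gates` helper here is hypothetical.

```python
# Hypothetical sketch of "unitary folding," a common way to deliberately
# amplify circuit noise. Replacing each gate G with G, G-inverse, G leaves
# the logical operation unchanged but roughly triples the gate noise,
# letting researchers observe how results degrade as noise grows.
def fold_gates(circuit, scale_factor):
    """circuit: list of (gate, inverse_gate) pairs in this toy model.
    scale_factor: odd integer (1, 3, 5, ...); 1 returns the circuit as-is."""
    assert scale_factor % 2 == 1, "odd factors keep the logic unchanged"
    folded = []
    for gate, inverse in circuit:
        folded.append(gate)
        for _ in range((scale_factor - 1) // 2):
            folded.extend([inverse, gate])  # inverse then gate: a noisy identity
    return folded

# Example: a two-gate toy circuit, folded to three times its native noise.
toy_circuit = [("H", "H"), ("CX", "CX")]  # both gates are self-inverse
print(fold_gates(toy_circuit, scale_factor=3))
```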
But if all you have is a noisy computer doing the calculations, how can you tell whether the calculations are correct?
The IBM team had access to two supercomputers: systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, and the NSF-funded Anvil supercomputer at Purdue University. These supercomputers ran the same quantum simulations that IBM performed on the 127-qubit Eagle QPU. The calculations were split as needed and performed in a way that let the results from both supercomputers be compared. That provides the ground truth: a solution achieved, verified, and known to be correct on a standard supercomputer. The light turns green to compare the noisy results against the correct ones.
“IBM asked if we would be interested in undertaking this project, knowing that our group specializes in the computational tools needed for this type of experiment,” says Sajant Anand, a graduate researcher at the University of California, Berkeley. “I thought it was an interesting project at first, but I didn’t expect the result to be like this.”
The rest is ‘just’ a matter of solving a ‘spot the difference’ puzzle. Once you know exactly how the presence of noise skews your results, you can correct for it and recover the same “ground truth” that the standard supercomputers produce. IBM calls this technique Zero Noise Extrapolation (ZNE).
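Here is a minimal sketch of the extrapolation step, with made-up numbers. The measured values and the choice of an exponential fit are assumptions for illustration; IBM’s actual procedure relies on a learned noise model and more careful extrapolation.

```python
import numpy as np

# Toy zero-noise extrapolation: the same expectation value is measured at
# several deliberately amplified noise levels (scale 1.0 = native hardware
# noise), a decay model is fitted, and the fit is evaluated at scale 0.
scales = np.array([1.0, 1.5, 2.0, 3.0])          # noise amplification factors
measured = np.array([0.71, 0.60, 0.51, 0.37])    # hypothetical measurements

# Fit an exponential decay E(s) = a * exp(-b * s) via a linear fit in log space.
slope, intercept = np.polyfit(scales, np.log(measured), 1)
zero_noise_estimate = np.exp(intercept)          # the fit's value at s = 0

print(f"extrapolated zero-noise expectation value ~ {zero_noise_estimate:.3f}")
```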
This is a symbiotic process. The IBM team behind the paper is also looking to bring its error mitigation techniques (including the equivalent of zero-noise extrapolation) to standard supercomputers. Between the raw power gains of modern hardware and algorithmic optimizations (such as smart compression algorithms), classical supercomputing will keep improving, and it will remain able, if only barely, to verify quantum computations until the post-NISQ era arrives with quantum error correction deployed.
That’s when the rope breaks, and quantum computing will need comparatively little classical verification of its results. That verification is part of what slows quantum computing down today (aside, of course, from the lack of error correction that would let the qubits themselves carry the computation).
In an interview with Tom’s Hardware for this article, Dr. Abhinav Kandala, Manager of Quantum Capabilities and Demonstrations at IBM Quantum, put it succinctly:
“…even with a noisy version of that state, we can measure what the properties of that state would be like in the absence of the noise.”
Dr. Abhinav Kandala
With quantum, problem complexity can be increased beyond what a supercomputer can handle. And because the impact of noise on the system is modeled correctly, the cleanup procedure can be applied to noisy results with some confidence. But the farther we get from the “definitively true” results a standard supercomputer can provide, the more likely we are to introduce catastrophic errors into the computation that are not (and could not be) accounted for in the noise model.
And the results are not merely credible: they actually provide useful quantum processing beyond what can be achieved with current-generation classical Turing machines such as the Berkeley supercomputer. That also surpasses what was thought possible with NISQ (Noisy Intermediate-Scale Quantum)-era computers. And, conveniently, many algorithms designed for near-future quantum devices fit within the 127 qubits of IBM’s Eagle QPU, allowing circuit depths beyond the 60 steps of quantum gates used here.
Dr. Kandala added: “What we’re doing with error mitigation is running short-depth quantum circuits and measuring so-called expectation values, to measure properties of states. That’s not all people want to do with quantum computers. To unlock all those capabilities, you potentially need quantum error correction, and the general feeling was that if you don’t have an error-corrected quantum computer, you can’t access anything useful.”
“The key was being able to manipulate noise beyond pulse stretching,” Dr. Kandala said. “Once it started working, it allowed us to do more complex extrapolations that could suppress the bias due to noise in ways we couldn’t before.”
ZNE has the potential to become a staple of any quantum computing approach. Error mitigation is an essential requirement for the error-prone NISQ computers in use today, and it may remain necessary even once error correction thresholds are reached and some qubits are dedicated to correcting errors in the computations of other qubits.
The work IBM has done here is already influencing the company’s roadmap. ZNE has the compelling quality of extracting better qubits from the qubits that can already be controlled within a quantum processing unit (QPU): better performance (less noise) with no extra logic, much like free extra megahertz. You can be confident these lessons have been considered and implemented as much as possible on the road towards “1 million + qubits.”
It is also difficult to ignore how this work shows that there is really no competition between quantum and classical. The future is certainly fusion, to play a little on the old AMD motto. In that fusion, specific computing elements serve specific processing needs. Every problem, no matter how complex, has its appropriate tool, from classical to quantum. And human ingenuity demands that we make full use of everything at our disposal.
The proverbial rope between standard supercomputers and quantum computers is only so long, but IBM keeps finding clever ways to stretch it. Thanks to this research, quantum computers can already see a little farther. Perhaps Dr. Kandala will get what he wants sooner than he expected. The playground for quantum utility has opened ahead of schedule; let’s see what humans can do in it.