IBM Quantum has entered the age of quantum utility—our quantum processors can now provide useful results to problems that challenge the best scalable classical methods. Now, we need to get utility-scale processors into the hands of our users.

What do we mean? Well, for the first time, we’re making utility-scale processors available on the IBM Cloud to access with a pay-as-you-go plan. Formerly reserved for our premium plan clients, the 127-qubit IBM Quantum Eagle is the first IBM Quantum processor to demonstrate evidence of quantum utility. Users are no longer limited by the scale of our 27-qubit processors and can begin to use processors to run calculations competitive with state-of-the-art classical methods.

Using error mitigation to outperform supercomputers

But what do we mean by quantum utility and utility-scale processors? Recently, IBM Quantum and UC Berkeley published a paper comparing an IBM Quantum Eagle processor running with error mitigation against supercomputers running state-of-the-art classical methods, with both calculating expectation values for a condensed matter physics problem. Not only did the quantum processor provide accurate answers, but it outperformed the scalable classical methods on certain complex quantum circuits.

These results came from a combination of IBM Quantum hardware expertise and error mitigation—methods that use classical processing to remove the effects of noise from a quantum computation. To return highly accurate expectation values, we extended a previously investigated error mitigation method, zero-noise extrapolation (ZNE), with the noise-model learning we explored in another error mitigation technique, probabilistic error cancellation (PEC). The Qiskit Runtime programming model and architecture lets IBM Cloud users run primitive programs called Sampler and Estimator with built-in error mitigation, so all of our users will be capable of performing similar calculations on their own.
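To make the idea behind zero-noise extrapolation concrete, here is a minimal sketch of the core fitting step, assuming a toy noise model where measured expectation values decay exponentially with the noise scale factor. The scale factors, decay rate, and polynomial fit are illustrative assumptions for this sketch, not the parameters used in the paper; in practice, Qiskit Runtime handles the noise amplification and extrapolation for you when error mitigation is enabled.

```python
import numpy as np

def zero_noise_extrapolate(scale_factors, noisy_values, degree=1):
    """Fit a polynomial to expectation values measured at amplified
    noise levels, then evaluate the fit at zero noise (scale = 0)."""
    coeffs = np.polyfit(scale_factors, noisy_values, degree)
    return np.polyval(coeffs, 0.0)

# Toy model (assumed for illustration): the ideal expectation value
# is 1.0, and noise damps the measured value exponentially with the
# noise scale factor. ZNE runs the circuit at deliberately amplified
# noise levels (scales >= 1) and extrapolates back to zero noise.
ideal = 1.0
scales = [1.0, 1.5, 2.0]
noisy = [ideal * np.exp(-0.1 * s) for s in scales]

mitigated = zero_noise_extrapolate(scales, noisy)
# The extrapolated value lands closer to the ideal 1.0 than any of
# the raw noisy measurements.
```

The essential point is that the quantum hardware never runs noise-free: the classical post-processing recovers an estimate of the noiseless result from a family of deliberately noisier runs.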

Everyone can take advantage of 100+ qubit systems

This change is part of IBM Quantum’s broader plan to transform our fleet to consist solely of utility-scale processors, starting with processors of 127 qubits or more. This is just the first step in that direction for our IBM Cloud offering: we will continue to add 127-qubit systems in the near future while sunsetting several 27-qubit systems. By ensuring our users have access only to the highest-performing systems, we can work together to push the field forward and develop use cases in this new era of quantum utility.

Get started and run your first job on a utility-scale quantum processor today on the IBM Cloud.
