IBM is modeling the universe with quantum computing
At Think 2026, IBM Research leadership emphasized that useful quantum computing is already here.
Since its inception, IBM Research has been laser-focused on developing what’s next in computing. This means shooting for the moon while staying firmly anchored in reality. The problems we want to solve aren’t purely academic; they’re not just thought experiments. Modeling financial systems, simulating the properties of new materials, developing more efficient batteries — these are just a few examples of the pressing, real-world problems that IBM, its partners, and businesses care about. They also present some of the most difficult computations for classical computers to grapple with.
The answer is to expand our computational toolbox. That means focusing our energy on developing computing systems that can perform the math to simulate quantum mechanics, which Director of Research and IBM Fellow Jay Gambetta calls “the operating system of the universe.” It might sound lofty, but this vision is firmly in line with what famed physicist Richard Feynman predicted decades before the first quantum computer ever existed: Nature is inherently quantum, and if we want to simulate quantum mechanics, we need to use a quantum computer.
Thanks to the efforts of IBM Research scientists, useful quantum computing is here. During the closing keynote of IBM’s 2026 Think conference in Boston, Gambetta told attendees that, instead of announcing what’s coming next, “I’m going to tell you what we already have.”
And that’s exactly what he did, with the help of IBM Research colleagues and leaders from Cleveland Clinic, Oak Ridge National Laboratory, and Q-CTRL. Each showcased how IBM Quantum computers are making it possible for them to accelerate their work in new ways.
“You’ve seen the technology, but it’s our clients and partners who turn it into something real,” Jamie Garcia, director of strategic growth and quantum partnerships at IBM, said onstage.
Simulating complex molecules
One of the tasks that classical computers struggle with is simulating the electronic structure of molecules. Electrons surround the nuclei of the atoms that form a molecule and fill the space between them. The more atoms a molecule contains, the more complex the interactions among its electrons — and the harder time a classical computer will have simulating them.
This week, researchers at Cleveland Clinic and RIKEN announced that they had successfully simulated a 12,635-atom protein complex using an algorithm called sample-based quantum diagonalization (SQD) on IBM quantum hardware. Running SQD for this complex simulation requires quantum-centric supercomputing (QCSC), which enables quantum and classical hardware — like RIKEN’s Fugaku supercomputer — to work in tandem, achieving more than either technology could on its own.
“Not only was this the largest quantum-centric simulation to date of a protein-ligand complex, but the team achieved a 210-times accuracy improvement over previous state-of-the-art quantum-centric approaches,” said Jerry Chow, CTO of quantum-centric supercomputing and IBM Fellow. “Where classical methods are struggling, quantum is just getting started.”
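The core idea behind SQD can be illustrated with a toy calculation. The sketch below is not the actual algorithm or any IBM code: the Hamiltonian is a small random symmetric matrix standing in for an exponentially large electronic-structure problem, and the "quantum" sampling step is faked with random index selection. What it does show correctly is the classical half of the method — projecting the Hamiltonian onto a sampled subspace of configurations and diagonalizing that much smaller matrix to estimate the ground-state energy.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy Hamiltonian in a 64-dimensional basis. In a real SQD run this
# space is exponentially large and is never built explicitly.
dim = 64
A = rng.normal(size=(dim, dim))
H = (A + A.T) / 2  # symmetric, like a real Hamiltonian matrix

# Exact ground-state energy for reference (feasible only at toy sizes).
exact_e0 = np.linalg.eigvalsh(H)[0]

# "Quantum" step (stand-in): pretend a quantum computer sampled a small
# set of promising basis configurations. Here we just pick at random.
sampled = rng.choice(dim, size=16, replace=False)

# Classical step: project H onto the sampled subspace and diagonalize.
H_sub = H[np.ix_(sampled, sampled)]
subspace_e0 = np.linalg.eigvalsh(H_sub)[0]

# By the variational principle (eigenvalue interlacing), the subspace
# estimate is an upper bound on the true ground-state energy.
print(subspace_e0 >= exact_e0)  # True
```

The better the sampled configurations capture the true ground state, the tighter this upper bound becomes — which is why sampling from a quantum device, rather than at random, is the point of the method.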
Modeling tomorrow’s energy solutions
Scientists at Oak Ridge National Laboratory are hard at work developing an energy production technology that was once the stuff of science fiction: nuclear fusion reactors. Rather than splitting atoms to release energy as existing nuclear fission reactors do, a fusion reactor could theoretically produce energy by fusing lighter atoms into heavier ones. Before that can happen, though, the fuel for a reactor must be carefully produced.
The walls inside a fusion reactor are lined with molten salt: a mixture of fluorine, lithium, and beryllium, known as FLiBe. This molten salt captures neutrons, which turn lithium into tritium, the fuel for fusion reactions. Unfortunately, only a few pounds of tritium are produced globally each year. Before we get to nuclear fusion, the world needs to be able to reliably produce more tritium.
“But getting the chemistry right is hard,” said Chow. “The molten salt mixture has completely different chemistry than proteins — one hard to study classically with high accuracy.” That’s where QCSC comes in.
In an experiment simulating the free energy of FLiBe atoms (the amount of energy available to perform work at constant temperature), the QCSC simulation yielded highly precise calculations that agreed with leading classical methods, without relying on crude approximations. “So what we have here is a path to computing chemically and physically relevant quantities that are measurable in the lab,” Chow said.
These QCSC workflows involve passing results back and forth between quantum and classical hardware, but this can be a slow process. Moving forward, IBM has shared a reference architecture for its vision of QCSC and will work with the community toward a more tightly integrated computing workflow.
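The back-and-forth this paragraph describes can be sketched as a simple feedback loop. Everything below is illustrative, not IBM's architecture: a random symmetric matrix stands in for the Hamiltonian, and a biased random sampler stands in for the quantum hardware. The structure it shows is the real point — each classical diagonalization informs the next round of quantum sampling, and every round trip between the two sides costs time, which is why tighter integration matters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: a small symmetric matrix standing in for a Hamiltonian.
dim = 64
A = rng.normal(size=(dim, dim))
H = (A + A.T) / 2

def quantum_sample(weights, k=12):
    """Stand-in for the quantum side: draw k basis configurations,
    favoring those the previous classical step weighted highly."""
    p = weights / weights.sum()
    return rng.choice(len(weights), size=k, replace=False, p=p)

weights = np.ones(dim)  # start with no preference
best = np.inf

for _ in range(5):  # each iteration is one quantum<->classical round trip
    idx = quantum_sample(weights)
    # Classical side: diagonalize in the sampled subspace.
    evals, evecs = np.linalg.eigh(H[np.ix_(idx, idx)])
    best = min(best, evals[0])
    # Feedback: boost configurations with large amplitude in the
    # subspace ground state, so the next round samples near them.
    weights[idx] += np.abs(evecs[:, 0])
```

In a real workflow each iteration crosses the boundary between a quantum processor and classical supercomputer, so reducing the latency of that loop directly speeds up the whole calculation.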
Pushing forward materials science on the IBM Quantum Platform
In a step forward for useful quantum computing, Q-CTRL this week reported that it used the IBM Quantum Platform, enhanced with its own performance-management infrastructure software, to complete a commercially relevant materials simulation more than 3,000 times faster than a leading classical method, while maintaining useful accuracy and finishing within practical time constraints. The IBM Quantum Platform gives users access to hardware on the cloud, as well as tools for executing quantum circuits.
It wasn’t just a ‘toy’ problem devised to demonstrate quantum hardware, but a simulation of practical significance for materials science, involving up to 60 interacting electrons — triple the roughly 20-electron ceiling at which classical methods top out.
And whereas Q-CTRL reports that performing this calculation on classical hardware can take 100 hours, its quantum workflow brought that time down to just two minutes, with accuracy within 1% of a leading classical method.
“Useful quantum is real right now, and we’re seeing interesting work coming in from our clients and partners,” said Gambetta. “And they’re just getting started.”