
GM frens, this is the Quantum Doom Clock with Colton Dillion and Rick Carback, the founders of Quip Network, the world’s shared quantum computer.
Happy 2026! Overall, the quantum new year arrived more stable, in more than one way. In practice, molecular qubits are now holding coherence long enough to run real-world chemistry simulations. National quantum programs are springing up across the globe, with government, corporate, and university partnerships focused on accelerating quantum technology and communicating its applications. While investment is not yet at the historic levels of fiber, railroads, or AI, money is pouring in to back all of this up. Timelines, too, have collapsed: projections for definitive, commercially available quantum advantage that were once measured in decades are now measured in years.
For this newsletter, in addition to our regularly scheduled summary of last month’s news and our analysis, we are giving you a bit more: A summary of all of this year’s quantum advantage claims along with our predictions for 2026.
Before that, this month we highlight Scott Aaronson’s post on recent advances in quantum:
This year updated me in favor of taking more seriously the aggressive pronouncements—the “roadmaps”—of Google, Quantinuum, QuEra, PsiQuantum, and other companies about where they could be in 2028 or 2029.
Obviously, we agree.
Five groups advertised quantum supremacy or quantum advantage claims in 2025. While all of these claims passed peer review or scientific scrutiny to varying degrees, each faces its own level of skepticism, which highlights that the boundary between quantum and classical computational supremacy remains fiercely contested. We have endeavoured to give you our opinion on each.
Notably absent is IBM, who staked the contrarian claim that the quantum computing community has not yet achieved quantum advantage and instead released a quantum advantage tracker on GitHub. This came two months after getting smacked down on bogus HSBC quantum trading advantage claims, which astute readers will note we listed but did not expand on in our October newsletter; at the time, it did not make sense that a classically simulatable trading algorithm would net better trades, as opposed to finding a solution more efficiently. The tracker itself is interesting but clearly geared toward IBM's hardware and existing partners, so time will tell how useful it will be.
With that, let’s get into the claims. We numbered them in order of what we judged most significant. Each of the following six sections starts with the organization, the date, and the title of what they did, then gets into how real the claim is and the responses it received.
In their peer-reviewed Science paper, "Beyond-Classical Computation in Quantum Simulation", D-Wave claimed its quantum annealer achieved computational supremacy on a practically useful problem: simulating quantum dynamics in programmable spin glasses, completing time evolution of spin-glass systems across 2D, 3D, and infinite-dimensional lattices. They claimed a speedup of approximately 20 minutes versus one million years on Oak Ridge National Laboratory’s Frontier supercomputer.
This claim was highly controversial and faced immediate attempts at refutation, with two main challenges:
Many will also point out that the D-Wave Advantage2 is an annealer, not gate-based. That means it cannot solve the same problems, so many feel it is not really a quantum computer. They are qubits, though, and it is unclear how much of a lift it would be to convert annealing qubits to gate-based, or to close the gap between annealers and adiabatic computers, which are known to be equivalent to gate-based paradigms. Any approach that yields results early is a good approach, so we think it counts.
Our conclusion is that the way they worded the claim, "beyond-classical quantum simulation," is simply too strong. The detractors have a point that the claim of one million years to simulate classically is likely overstated as methods for simulation and hardware only get better over time, not worse. We have not yet seen any attempts that yield better or faster results than D-Wave, however, so this result stands so far.
Google Quantum AI published in Nature what it called the first "verifiable quantum advantage", with its 105-qubit Willow chip implementing their Quantum Echoes algorithm on 65 of the qubits. The algorithm measures out-of-time-order correlators (OTOC(2)), a metric describing how quantum information scrambles in highly entangled systems. They claimed to be 13,000x faster than classical simulation: roughly 2 hours on Willow versus an estimated 3.2 years on the Frontier supercomputer.
Their work includes a molecular-structure application demonstrating practical use. Specifically, they computed hydrogen-atom distances in molecules containing 15 and 28 atoms and validated the results against traditional NMR data; otherwise, the output could only have been checked against another quantum computer.
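For readers curious what an OTOC actually measures, here is a toy sketch of our own (not Google's implementation): the infinite-temperature OTOC F(t) = Tr(W(t)·V·W(t)·V) / 2^n for two operators on distant spins in a small chain, computed by exact diagonalization with NumPy. The Hamiltonian and operator choices are illustrative assumptions; the point is that F starts at 1 for commuting far-apart operators and decays as scrambling spreads information between them.

```python
import numpy as np
from functools import reduce

# Single-qubit Pauli matrices and identity
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def site_op(op, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit chain."""
    return reduce(np.kron, [op if i == site else I2 for i in range(n)])

n = 4  # a tiny 4-spin chain, small enough for exact diagonalization
# Transverse-field Ising Hamiltonian (an illustrative scrambling model)
H = sum(site_op(Z, i, n) @ site_op(Z, i + 1, n) for i in range(n - 1))
H = H + sum(site_op(X, i, n) for i in range(n))

evals, evecs = np.linalg.eigh(H)

def heisenberg(op, t):
    """Heisenberg-evolve op: W(t) = U^dag W U with U = exp(-iHt)."""
    U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    return U.conj().T @ op @ U

def otoc(t, W, V):
    """Infinite-temperature OTOC F(t) = Tr(W(t) V W(t) V) / 2^n."""
    Wt = heisenberg(W, t)
    return np.real(np.trace(Wt @ V @ Wt @ V)) / 2 ** n

W = site_op(Z, 0, n)      # "butterfly" operator on the first site
V = site_op(Z, n - 1, n)  # probe operator on the last site
print(round(otoc(0.0, W, V), 6))  # 1.0: distant operators commute at t=0
print(otoc(3.0, W, V))            # strictly below 1 once information scrambles
```

Willow runs this on 65 qubits, far beyond the reach of dense matrices like these; that gap is exactly where the advantage claim lives.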
Unlike in the past, Google’s quantum supremacy claims are sticking this time. With the NMR validation it looks like Google really is the first gate-based system to stake a reasonable claim to quantum advantage. We think, however, that D-Wave’s results indicate potential for commercial applicability first.
Published as a cover article in Physical Review Letters, "Establishing a New Benchmark in Quantum Computational Advantage with 105-qubit Zuchongzhi 3.0 Processor" saw authors from the University of Science and Technology of China (USTC) claim a speedup of 10^15 over Frontier and 10^6 over Google’s Sycamore on their 105-qubit system, using only 83 qubits for the benchmarks.
To our knowledge this result is not contested, but random circuit sampling is just a benchmark, not a useful workload. One could also raise objections similar to those against D-Wave, namely that faster classical simulations could still emerge, but the head-to-head comparison against Sycamore stands, so the result is legitimate. We just wish it demonstrated something more useful, like Google’s OTOC work.
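For context on how random circuit sampling claims are scored, here is a minimal sketch of the linear cross-entropy benchmark (XEB), with a Haar-random unitary standing in for a deep random circuit. This is our own toy, not USTC's benchmark code: a perfect device scores near 1 because it preferentially outputs the "heavy" bitstrings of the ideal distribution, while uniform guessing scores near 0.

```python
import numpy as np

rng = np.random.default_rng(7)

def haar_unitary(dim):
    """Sample a Haar-random unitary via QR of a complex Gaussian matrix."""
    ginibre = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(ginibre)
    # Fix column phases so the distribution is exactly Haar
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

n = 8
dim = 2 ** n
U = haar_unitary(dim)           # stand-in for a deep random circuit
p_ideal = np.abs(U[:, 0]) ** 2  # output distribution starting from |0...0>

# A noiseless device samples bitstrings from p_ideal itself
samples = rng.choice(dim, size=20_000, p=p_ideal)

# Linear XEB: F = 2^n * <p_ideal(x)>_samples - 1
f_xeb = dim * p_ideal[samples].mean() - 1
print(f"ideal device XEB ~ {f_xeb:.2f}")    # close to 1

# Uniformly random guessing carries no signal
uniform = rng.integers(dim, size=20_000)
print(f"uniform guess XEB ~ {dim * p_ideal[uniform].mean() - 1:.2f}")  # close to 0
```

Real hardware lands somewhere between the two, and the supremacy argument is that classically reproducing even a small positive XEB at 83 qubits is intractable.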
While not peer reviewed, Quantinuum’s preprint on quantum information supremacy is an interesting read. Using 12 qubits on the H1 trapped-ion system, they complete a task for which even the most memory-efficient classical algorithm requires between 62 and 382 bits of memory.
Memory efficiency is not usually what we mean by quantum advantage, and there is a question of whether the classical control system attached to the quantum computer, which has its own memory, should be counted.
While it is neither a speedup nor a power-efficiency gain, the result derives from fundamental mathematical properties of quantum mechanics. We think that counts as a kind of quantum advantage, as it demonstrates a fundamental separation between the quantum and classical paradigms.
Also notable: Quantinuum released a preprint on Helios, a 98-qubit trapped-ion system. The authors include random circuit sampling results demonstrating operation "well beyond the reach of classical simulation," but do not quantify it against other systems.
From a different group out of USTC, the Jiuzhang preprint, “Robust quantum computational advantage with programmable 3050-photon Gaussian boson sampling,” generated boson samples in 25.6 microseconds, versus a claimed >10^42 years for classical simulation of the equivalent tensor network.
As noted in their paper, previous experiments were challenged by classical algorithms that could reproduce k-photon correlations for small k values. The new system measures correlations up to ~19 photons, which should close this loophole.
Jiuzhang is not a general-purpose gate-based system. While Gaussian boson sampling has some applications, they are more limited than even those of quantum annealers like the D-Wave. That said, the demonstration is one of quantum advantage AND it is photonics-based, so it deserves the recognition.
While not "firsts" for their respective platforms, perhaps most interesting in the real-world-application sense is the pair of papers released by Phasecraft in October, which simulated fermionic dynamics on the Quantinuum H2 and Willow chips, respectively. Quantinuum’s related Fermi-Hubbard preprint is also notable: instead of evolving an arbitrary initial state, they try to find a physically relevant initial state to evolve, which is an even more useful starting point.
The prospect of running the same experiments across devices enables comparison. That is what we’re all about over on the quantum doom clock, so we are excited both to see these and the aforementioned quantum advantage tracker.
We do not have anything to check from last year, so let’s get right into what we expect for 2026:
We did not include a logical or physical qubit count estimate since those are all on the roadmaps by the manufacturers, so it felt too easy! Our most aggressive prediction, by far, is #3. Error correction has come a long way but that would represent a surprising result, especially if it does not come from a trapped ion system like IonQ.
Just in time for Christmas, QuEra announced its Gemini quantum computer, featuring a 260-qubit neutral-atom array with high-fidelity gates and room-temperature operation, advancing scalable fault-tolerant quantum computing. It achieves 99.9% fidelity on global 1-qubit gates, 99.2% on global 2-qubit gates, and SPAM fidelity up to 99.7%. The qubits are all-to-all connected, and while the system is not generally available, an MIT/Harvard team has had a crack at it and QuEra is accepting consultation applications.
In related news, QuantWare’s VIO-40K architecture release is touted to enable mass production of 10,000-qubit quantum processors, a 100x leap from current industry standards. We caution that this is mostly a fabrication-capabilities announcement; buried in their materials is the admission that they do not anticipate any QPUs being released before 2028.
A cluster of quantum computing advancements highlighted significant progress in system stability, error rates, and scalable control. Researchers at the University of Colorado Boulder and Sandia National Laboratories developed a chip-scale optical phase modulator nearly 100 times smaller than the width of a human hair. The modulators they designed “could help unlock much larger quantum computers by enabling efficient control of lasers required to operate thousands or even millions of qubits,” and they are built on a familiar CMOS process, which could lead to much better scaling in photonics platforms.
As for error rate improvements, Sandia National Laboratories developed fast-feedback protocols to enable real-time calibration of 1- and 2-qubit gates, controlling drift and SPAM errors. Karlsruhe Institute of Technology (KIT) and Université de Sherbrooke in Québec developed a method to calibrate qubit charge, reducing readout faults. CSIRO and The University of Melbourne developed a quantum machine learning method that reduces hardware demands using partial error correction, advancing the path to practical quantum-enhanced learning. Macquarie University researchers, led by Dr Christina Giarmatzi, fully mapped how quantum noise spreads over time, revealing time-linked errors that form a memory-like pattern undermining computation reliability. Researchers from Pennsylvania State University developed a forensic framework using graph neural networks to predict quantum error rates from public data, enabling performance assessment without confidential calibration information.
Lastly, Atom Computing developed a protocol to reset and reload lost atoms, enabling a self-repairing quantum processor that maintains qubit functionality throughout long computations. They shift the resonance of the unmeasured atoms with laser beams, then use a second set of lasers to reinitialize the system so it can rejoin the next step of the computation. This could be very useful for long-lived computations.
Quantum network integration is accelerating through photonic and telecom-compatible innovations. Gianvito Chiarella and his colleagues at the Max Planck Institute of Quantum Optics developed a node-heralding strategy that boosts quantum communication efficiency by reducing delays and photon-loss errors. Researchers at the University of Chicago Pritzker School of Molecular Engineering developed a new method that could theoretically extend quantum networks to 2,000 km (about 1,243 miles).
In terms of actual results, researchers demonstrated a multiplexed quantum network node over 12 km of fiber. Aegiq Ltd. and the University of Sheffield achieved deterministic generation of complex entangled states using redundant encoding of Greenberger-Horne-Zeilinger (GHZ) states, boosting fusion success rates beyond 50% for scalable photonic quantum computing. Meanwhile, US-based researchers announced an erbium-based molecular qubit that lets quantum systems interface with standard fiber-optic networks, using telecom wavelengths for light-based control and readout of magnetic quantum states.
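As a quick illustration of why GHZ states are such a prized resource, here is a toy NumPy sketch of our own (unrelated to the Aegiq encoding) that builds an n-qubit GHZ statevector and checks its hallmark signature: the all-qubit parity is perfectly correlated even though every individual qubit's outcome is maximally random.

```python
import numpy as np
from functools import reduce

def ghz(n):
    """|GHZ_n> = (|00...0> + |11...1>) / sqrt(2) as a statevector."""
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = psi[-1] = 1 / np.sqrt(2)
    return psi

def z_string(sites, n):
    """Diagonal of a tensor product with Z on `sites`, identity elsewhere."""
    z, ident = np.array([1, -1]), np.array([1, 1])
    return reduce(np.kron, [z if i in sites else ident for i in range(n)])

n = 4
psi = ghz(n)

# All-qubit parity <Z x Z x Z x Z> is +1: outcomes are perfectly correlated
parity = np.real(np.vdot(psi, z_string(range(n), n) * psi))
print(round(parity, 6))  # 1.0

# But any single qubit's <Z> is 0: each outcome alone is a fair coin flip
z_first = np.real(np.vdot(psi, z_string({0}, n) * psi))
print(round(z_first, 6))  # 0.0
```

That combination of global certainty and local randomness is what makes GHZ states useful as a fusion resource in photonic architectures, and why boosting their generation rate matters.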
Have a question? Just e-mail us at team at quantumdoomclock dot com. We did not get any this month, but when you have them, send them our way!
The Quantum Doom Clock is brought to you by Richard Carback and Colton Dillion, the cofounders of Quip Network
The Quantum Doom Clock is a monthly mailing list that summarizes news for Quantum Computing and its effects on the cryptography and cryptocurrency spaces. We do not sell your e-mail.
