The path to those million qubits is being paved by dramatic improvements in error correction. IBM and partners demonstrated the power of partial fault tolerance in Nature Communications, using the Iceberg error detection code to improve QAOA performance on problems with up to 20 logical qubits on trapped-ion hardware. This is the largest universal quantum computing algorithm protected by error detection and run on a practical application to date.
Meanwhile, Xanadu achieved a significant milestone: the first-ever generation of photonic qubits on an integrated chip, with experiments showing critical features necessary for fault tolerance. Yahoo Finance notes this validates a key pillar of bosonic architectures for photonic quantum computing.
The race for better qubits is also producing stunning results. Caltech researchers led by Professor Manuel Endres demonstrated "hyper-entanglement" in neutral atoms by simultaneously controlling their motion and internal energy states—a first for massive particles. This breakthrough expands the information capacity per atom and introduces motion as a controllable quantum resource.
The crown jewel improvement here comes from Oxford. As New Scientist reports, Molly Smith, Aaron Leu, and Mario Gely created a single-qubit gate that produces an error only about once in 10 million operations, reliability on par with classical computers. As Leu notes: "The probability of being struck by lightning in a year is about three times higher than the probability that this qubit makes an error." When individual operations become this reliable, the path to fault-tolerant quantum computing becomes dramatically clearer.
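To get a feel for what that error rate means in practice, here is a back-of-the-envelope sketch of our own. The rates below are illustrative assumptions for comparison, not measured figures from any of these papers:

```python
# Back-of-the-envelope: probability that N gates in a row all succeed,
# assuming independent errors at a fixed per-gate error rate.
# Both rates below are illustrative assumptions, not measured values.

def circuit_success(error_rate: float, num_gates: int) -> float:
    """Probability that num_gates consecutive gates all succeed."""
    return (1 - error_rate) ** num_gates

typical = 1e-3   # rough ballpark for many of today's gates
oxford = 1e-7    # roughly the reported Oxford-class error rate

for n in (1_000, 1_000_000):
    print(f"{n:>9} gates: typical {circuit_success(typical, n):.3f}, "
          f"Oxford-class {circuit_success(oxford, n):.3f}")
```

Under these assumptions, a thousand-gate circuit at a typical error rate already fails about two times in three, while at Oxford-class rates you can chain a million gates and still succeed roughly nine times out of ten.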
The exponential scaling we have been tracking on the Quantum Doom Clock continues relentlessly. This month brought a cascade of announcements that would have seemed like science fiction just a few years ago.
IBM dropped perhaps the most notable announcement regarding their path to IBM Quantum Starling, projected to be the first large-scale, fault-tolerant quantum computer. Their breakthrough research charts a viable path to systems running 200 logical qubits by 2028. This timeline closely matches that of Oxford Quantum Circuits, which is also targeting 200 logical qubits by 2028 and upwards of 50,000 logical qubits by 2034, a scale that would require several million physical qubits using their dual-rail ‘Dimon’ technology.
On longer timelines, Nord Quantique promises a 1,000-qubit machine by 2031 that they claim could make traditional HPC obsolete, cracking RSA-830 in an hour while slashing energy use by 99%. Xanadu’s vision offers a different form of scale: a quantum computing data center by 2029, with thousands of 24-qubit quantum server racks networked together for fault-tolerant, universal quantum computing.
While these promises are exciting, systems actually deployed today are also showing steady progress.
The pattern is clear: qubit counts are scaling exponentially, exactly as our Quantum Doom Clock predicted. More importantly, these aren’t just press releases; they’re actual systems being built, shipped, and put into production.
As we have said before, the quantum computing industry is experiencing a fundamental shift from research curiosity to commercial reality. This month’s developments show quantum technology moving decisively into the marketplace. Here’s a rundown of the latest developments:
MicroStrategy’s Michael Saylor dismissed quantum concerns on CNBC’s Squawk Box as "mainly marketing from people that want to sell you the next quantum yo-yo token." His confidence that even legitimate quantum threats from tech giants "won’t be released" stands in stark opposition to everything we are tracking. We think this plays out quite a bit differently.
This dismissal comes even as BlackRock updated their Bitcoin ETF prospectus to explicitly include quantum computing as a risk factor. When the world’s largest asset manager acknowledges the threat while Bitcoin’s loudest advocate dismisses it, investors face a confusing landscape.
Indeed, even academic perspectives are shifting. Teams from EPFL and the Flatiron Institute initially pushed back with a classical simulation that replicated aspects of D-Wave’s results from last month using a variational Monte Carlo method, but those efforts fell far short of overturning D-Wave’s results. The real-world implications are now becoming clearer, and we are continuing to see more useful applications emerge as a result.
Even long-time skeptics are beginning to acknowledge the substance of recent developments. Perhaps most telling is the shift from Sabine Hossenfelder. Her recent explainer video acknowledges that "a recent series of quantum computing developments might mean that commercially useful quantum computers could be just a few years off."
Coming from someone who has consistently deflated quantum hype, her assessment that quantum computers are "becoming dangerous faster than we thought" carries significant weight. When skeptics start taking the threat seriously, it’s time for the crypto community to move beyond denial.
Q: Is the million-qubit threshold for breaking RSA really that much closer than 20 million?
A: Yes, and it’s not just about raw numbers. Gidney’s revision shows that improved algorithms alone can dramatically reduce hardware requirements, cutting the estimated cost of factoring RSA-2048 from roughly 20 million noisy qubits to under one million. Combined with this month’s error correction breakthroughs and Oxford’s 1-in-10-million error rate, the path to cryptographically relevant quantum computers is shortening from multiple angles simultaneously.
Also, it is worth noting that we have been predicting this publicly since the beginning of the year.
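For intuition on why better error rates and better algorithms compound, here is a minimal sketch using the textbook surface-code scaling relations. This is our own illustration, not Gidney’s actual resource model; the threshold and prefactor below are assumed round numbers:

```python
# Minimal sketch of standard surface-code scaling (illustrative, not
# Gidney's actual resource model). Logical error rate is roughly
#   p_L ~ A * (p / p_th) ** ((d + 1) / 2)
# and one logical qubit costs about 2 * d**2 physical qubits.

A, P_TH = 0.1, 1e-2  # assumed prefactor and error threshold

def distance_needed(p: float, target_pL: float) -> int:
    """Smallest odd code distance d reaching target_pL at physical error p."""
    d = 3
    while A * (p / P_TH) ** ((d + 1) / 2) > target_pL:
        d += 2
    return d

for p in (1e-3, 1e-4):
    d = distance_needed(p, target_pL=1e-12)
    print(f"p={p:.0e}: distance {d}, ~{2 * d * d} physical qubits/logical")
```

Under these assumed constants, a tenfold improvement in physical error rate cuts the overhead per logical qubit by more than a factor of three; pair that with an algorithm that needs fewer logical qubits and the total physical-qubit bill shrinks multiplicatively.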
Q: How can companies claim to compete with such different qubit counts? Is China really ahead with 1,000 qubits?
A: Qubit count alone is misleading—quality matters more than quantity. A high-fidelity 100-qubit system with good error correction can outperform a noisy 1,000-qubit system.
It is hard to evaluate many of these systems at face value because they lack publicly verifiable testing. Focus on metrics like quantum volume, error rates, and actual demonstrated applications rather than raw counts; without hard results, it is really impossible to say.
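To make "quality over quantity" concrete, here is a toy model of our own (every number in it is an illustrative assumption) that estimates the largest square circuit, width equal to depth, the shape quantum volume tests use, that a machine can run with success probability above two-thirds:

```python
# Toy model: largest square circuit (width n, depth n) a machine can run
# with success probability above 2/3, given a per-gate error rate. This
# mirrors the shape of quantum-volume tests; all numbers are assumptions.

def max_square_circuit(num_qubits: int, error_rate: float) -> int:
    n = 1
    while n < num_qubits and (1 - error_rate) ** ((n + 1) ** 2) > 2 / 3:
        n += 1
    return n

noisy_1000 = max_square_circuit(1000, 1e-2)  # noisy 1,000-qubit machine
clean_100 = max_square_circuit(100, 1e-4)    # high-fidelity 100-qubit machine
print(f"noisy 1,000-qubit machine: ~{noisy_1000}x{noisy_1000} circuits")
print(f"clean 100-qubit machine:   ~{clean_100}x{clean_100} circuits")
```

In this toy model the smaller, cleaner machine runs usefully deep circuits roughly ten times wider than the big noisy one, which is exactly the intuition quantum volume is meant to capture.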
Q: Should I actually worry about my Bitcoin/crypto given Saylor’s dismissal?
A: When BlackRock adds quantum risk to official documents while active developers create defensive protocols (QRAMP, BIP-360, etc.; see our previous newsletters), dismissing the threat seems unwise. The question is not whether quantum computers will threaten current cryptography, but when. Starting migration to post-quantum protections before it is urgent is prudent risk management.
Have a question? Just e-mail us at team at quantumdoomclock dot com. The questions above are the top ones we have received since our last update.
The Quantum Doom Clock is brought to you by Richard Carback and Colton Dillion, the cofounders of Quip Network.
The Quantum Doom Clock is a monthly mailing list that summarizes quantum computing news and its effects on the cryptography and cryptocurrency spaces. We do not sell your e-mail.
