Quantum Unfiltered #7 — The Advantage Arrives (For Some Industries Sooner Than You Think)
Quantum utility claims are stacking up. Here's what the evidence actually supports.
A quick note: you may have noticed a new name. The Quantum Observer is now Quantum Unfiltered. The change is partly practical — a new quantum media property is launching at quantumobserver.com in the coming days, and I wanted a clean separation — but it is also editorial. The new name better reflects what this newsletter actually does: unfiltered analysis, no vendor cheerleading, no doomsaying. Now, back to work.
In this edition: I spent quite a few weeks mapping every peer-reviewed fault-tolerant quantum algorithm to its real-world application, hardware requirements, and timeline. The result, my seven-part Quantum Utility Map Deep Dive, reveals a utility ladder where some industries reach quantum advantage at hardware levels that are almost here, while others need machines further down the roadmap. Crucially, the cryptographic threshold sits near the bottom of that ladder, not the top. This edition unpacks that thesis and examines why a cluster of recent results from Q-CTRL, IBM, QuEra, IonQ, and others is validating exactly the pattern the research predicted. Plus: Cisco builds quantum networking infrastructure, the error correction revolution keeps compressing timelines, and a Bitcoin conference presents pseudoscience claiming quantum computers are ontologically impossible. That last one earns this edition's Quantum Flapdoodle.
The Quantum Utility Ladder — and Where Your Industry Sits on It
I published one of the most ambitious projects I’ve attempted on PostQuantum.com: The Quantum Utility Map, a seven-part Deep Dive mapping every major fault-tolerant quantum algorithm resource estimate to the real-world problem it solves. Hundreds of papers, weeks of research, over 20,000 words across seven articles. The picture that emerges is more nuanced than either the optimists or the pessimists want to hear.
The utility picture is a ladder, not a switch. Some industries will reach clear quantum advantage at hardware levels that are close. Others will need to wait a few years longer for the return on investment to materialize. A common assumption, that you can wait for fault-tolerant quantum computers to start reshaping industries before worrying about the cryptographic threat, turns out to be exactly backwards.
At the lower rungs, condensed-matter physics and quantum chemistry reach utility first. Useful simulations of photosensitizer molecules (relevant to cancer therapy) become tractable at around 180 logical qubits. Battery material simulation works at fewer than 500. The hardware for these applications is coming in the next few years, and the Fermi-Hubbard model (the workhorse of condensed matter physics) is already being demonstrated at meaningful scale on today’s noisy hardware. More on that below.
In the middle, the grand challenge chemistry problems occupy the 1,000–5,000 logical qubit range. FeMoco (nitrogen fixation), cytochrome P450 (drug metabolism), ruthenium catalysts (CO₂ conversion). These are the applications that will reshape pharmaceutical R&D, chemicals, and advanced materials — but they need larger, more mature fault-tolerant machines that are further out on the roadmap.
For finance, logistics, and machine learning, the picture is different (but “different” does not mean “disengage”). The proposed quantum speedups in these sectors are typically quadratic, which means error correction overhead eats the advantage when competing against highly optimized, massively parallelizable classical systems. The math does not close today, and closing it will take larger machines, better algorithms, or both, which puts these sectors a few rungs higher on the ladder. Algorithmic discovery is unpredictable; a genuine breakthrough in quantum optimization would rewrite the analysis. These sectors should keep a monitoring capability in place and track the signals that would change the assessment, rather than either betting the strategy on quantum or walking away.
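To make the overhead problem concrete, here is a back-of-envelope sketch in which every number is an illustrative assumption rather than a measurement: a quadratic speedup, a classical cluster doing 10¹³ simple operations per second in aggregate, and an error-corrected quantum computer executing 10⁴ logical operations per second.

```python
# Back-of-envelope crossover for a quadratic (Grover-style) speedup.
# Both rates below are illustrative assumptions, not benchmarks.
classical_ops_per_sec = 1e13       # assumed: highly parallel classical cluster
quantum_logical_ops_per_sec = 1e4  # assumed: error-corrected logical gate rate

# Classical cost ~ N operations; quantum cost ~ sqrt(N) logical operations.
# Quantum wins once sqrt(N) / q_rate < N / c_rate, i.e. once sqrt(N) > c_rate / q_rate.
rate_ratio = classical_ops_per_sec / quantum_logical_ops_per_sec
crossover_n = rate_ratio ** 2
quantum_hours_at_crossover = (crossover_n ** 0.5) / quantum_logical_ops_per_sec / 3600

print(f"crossover problem size: N ~ {crossover_n:.0e}")
print(f"quantum runtime just to break even: ~{quantum_hours_at_crossover:.0f} hours")
# With these assumptions: N ~ 1e+18 and roughly 28 hours of QPU time before any
# advantage appears, which is why quadratic speedups struggle to pay for themselves.
```

The exact numbers are placeholders; the shape of the result is the point. A quadratic advantage has to survive an enormous constant-factor handicap before it pays off.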
Here is the finding that should reframe the conversation for many CISOs: the CRQC threshold, the point at which a quantum computer can break today’s public-key cryptography, sits near the bottom of the utility ladder. Breaking RSA-2048 requires roughly 1,399 logical qubits. Breaking 256-bit elliptic curve cryptography requires roughly 1,200. Most of the grand challenge chemistry applications require more. The CRQC arrives before the transformative science, not after. If you are waiting to see quantum computers deliver industrial value before starting your PQC migration, the science tells you that by the time they do, your encryption will already be vulnerable.
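For readers who prefer to see the ordering rather than read about it, a minimal sketch that restates the rounded figures above (no new numbers; the grand-challenge chemistry range is shown at its upper end):

```python
# The ladder at a glance, using the rounded logical-qubit figures from this edition.
# Battery materials is an upper bound (<500); grand-challenge chemistry spans
# roughly 1,000 to 5,000 logical qubits and is shown here at its upper end.
ladder = [
    ("Photosensitizer molecules (cancer therapy)", 180),
    ("Battery material simulation", 500),
    ("ECC-256 key recovery (CRQC threshold)", 1_200),
    ("RSA-2048 factoring (CRQC threshold)", 1_399),
    ("Grand-challenge chemistry (FeMoco, P450, Ru catalysts)", 5_000),
]

for application, logical_qubits in sorted(ladder, key=lambda item: item[1]):
    print(f"{logical_qubits:>5,}  {application}")
```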
The Evidence Is Arriving
Within days of publishing the initial version of the Utility Map, three results landed that read like a controlled experiment validating the thesis. (I have since folded them into the Map.)
Q-CTRL’s Fermi-Hubbard simulation is the strongest. Two minutes and forty-six seconds of QPU time on IBM’s 156-qubit Heron processor to simulate 60 interacting electrons across a one-dimensional chain, using 120 qubits and over 9,000 two-qubit gates. The best classical alternative (ITensor’s TDVP solver) needed over 100 hours. A 3,000-fold wall-clock speedup.
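For reference, the model in question is the standard one-dimensional Fermi-Hubbard Hamiltonian, with hopping amplitude t and on-site repulsion U (textbook form, nothing specific to the Q-CTRL implementation):

```latex
% Standard 1D Fermi-Hubbard Hamiltonian: hopping term plus on-site repulsion
H = -t \sum_{i,\,\sigma \in \{\uparrow,\downarrow\}}
      \left( c^{\dagger}_{i,\sigma} c_{i+1,\sigma} + c^{\dagger}_{i+1,\sigma} c_{i,\sigma} \right)
    + U \sum_{i} n_{i,\uparrow}\, n_{i,\downarrow}
```

Each lattice site carries a spin-up and a spin-down fermionic mode, one qubit per spin-orbital, which is how a 60-electron chain ends up on 120 qubits.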
This is the largest and most accurate gate-based digital simulation of the Fermi-Hubbard model reported to date, and Q-CTRL’s compilation pipeline is the unsung enabler. By reducing circuit depth by over 80% and gate count by over 60%, they ran the widest simulation at 9,057 two-qubit gates across 152 layers. Separately, the study's deepest circuits used 62 qubits with 13,829 two-qubit gates across 452 layers. Whether you call it “quantum advantage” depends on where you draw the definitional line: Q-CTRL compared against the best available tool, not the best tool that could theoretically exist. That debate will play out in the literature. What matters for the Utility Map is the application: Fermi-Hubbard is one of the strongest near-term candidates for useful quantum computation that my research identified. Q-CTRL just demonstrated it at scale on commercial hardware.
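Why does compilation matter so much on noisy hardware? Because errors compound multiplicatively with gate count. A rough sketch, assuming an illustrative 0.3% two-qubit error rate (my placeholder, not a Heron specification) and treating the pre-compilation count as roughly what 9,057 gates would have been before a 60% reduction:

```python
# How raw circuit fidelity scales with two-qubit gate count, ignoring every other
# error source. The 0.3% error rate is an illustrative placeholder, not a Heron spec.
p_two_qubit_error = 0.003

def naive_circuit_fidelity(num_two_qubit_gates: int) -> float:
    """Probability that no two-qubit gate errs, under a uniform independent-error model."""
    return (1.0 - p_two_qubit_error) ** num_two_qubit_gates

# 9,057 and 13,829 are the circuits reported above; ~23,000 approximates the widest
# circuit before a ~60% gate-count reduction.
for gates in (9_057, 13_829, 23_000):
    print(f"{gates:>6,} gates -> naive success probability {naive_circuit_fidelity(gates):.0e}")
```

At these depths the unmitigated success probability is effectively zero, which is exactly why the compilation gains, error suppression, and mitigation carry so much of the weight.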
On the chemistry side, Cleveland Clinic, RIKEN, and IBM simulated the electronic structure of a 12,635-atom protein-ligand complex, a 40-fold increase in system size over a simulation performed just four months earlier. Two Heron r2 processors, up to 94 qubits, 9,200 circuits, 1.3 billion measurement outcomes. This doesn’t yet outperform classical methods for protein chemistry. But the trajectory (303 atoms to 12,635 atoms in four months, with a 210-fold accuracy improvement on a key step) tracks the kind of exponential scaling that precedes competitive advantage. The quantum processor handles the electron correlation clusters, which is exactly the quantum-hard part of the problem that the Utility Ladder identifies as the natural domain for quantum advantage.
IBM’s CEO put a timeline on it. Arvind Krishna told investors in early May that partners will demonstrate the first examples of quantum advantage this year. This is one of IBM's clearest public timelines yet for quantum advantage, and the Q-CTRL and Cleveland Clinic results explain why they are willing to make it. The quantum-centric supercomputing pipeline, pairing Heron QPUs with Fugaku, is delivering real results on real problems.
Notice the pattern. Every result that holds up to scrutiny is a physics or chemistry simulation. None is a finance optimization or a logistics routing problem. The utility ladder thesis is playing out in real time.
Why the Timelines Are Compressing
The other major finding from the Utility Map is that error correction is advancing faster than hardware. Three advances in particular are compressing the physical-to-logical qubit ratio by an order of magnitude or more: qLDPC codes, magic state cultivation, and algorithmic fault tolerance. Recent results push this even further.
QuEra, Harvard, and MIT published a preprint demonstrating in circuit-level noise simulations that ultra-high-rate qLDPC codes can achieve a 2:1 physical-to-logical qubit ratio. The standard surface code requires roughly 1,000 physical qubits per logical qubit. A 2:1 ratio enters territory where the overhead almost vanishes. QuEra’s neutral-atom architecture has the native long-range connectivity that qLDPC codes demand, which makes this more than a theoretical exercise. If realized on hardware, applications I estimated would need tens of thousands of physical qubits might need a fraction of that.
IonQ is pursuing the same opportunity from a different modality. Their 110-page “Walking Cat” fault-tolerant blueprint claims 110 logical qubits from 2,514 physical qubits using qLDPC codes on trapped ions. Trapped ions have native all-to-all connectivity within a trap zone, which is exactly what qLDPC’s non-local stabilizer checks require. Combined with IonQ’s recent photonic interconnect demonstration linking two systems via entangled photons, the building blocks for modular fault-tolerant computing are being assembled in parallel.
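It is worth spelling out the encoding arithmetic these blueprints imply. A rough comparison, using the figures above plus the commonly quoted surface-code ballpark (the numbers target different error rates and assumptions, so treat this as orientation, not a like-for-like benchmark):

```python
# Physical-per-logical overhead implied by the figures above.
# The surface-code entry is the commonly quoted ~1,000:1 ballpark, not a specific paper.
ratios = {
    "Surface code (ballpark)": 1000.0,
    "IonQ 'Walking Cat' blueprint (2,514 / 110)": 2514 / 110,
    "QuEra/Harvard/MIT qLDPC (simulated)": 2.0,
}

first_rung_logical = 180  # photosensitizer chemistry, the ladder's first rung

for name, ratio in ratios.items():
    physical = first_rung_logical * ratio
    print(f"{name:<44} ~{ratio:6.1f}:1 -> ~{physical:,.0f} physical qubits for 180 logical")
```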
Meanwhile, Harvard’s cascade neural decoder achieves 10⁻¹⁰ logical error rates on qLDPC codes, addressing one of the practical barriers to their adoption: decoding fast enough for real-time correction.
Every order-of-magnitude improvement in encoding efficiency pulls the entire utility ladder closer. The first rung, useful quantum chemistry at ~180 logical qubits, could arrive much earlier than current roadmaps suggest. And since the CRQC threshold sits near the bottom of that same ladder, the security implication is clear: the preparation window for PQC migration is shorter than most organizations assume.
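If you want to pressure-test your own window, the standard framing is Mosca’s inequality: trouble starts when the time your data must remain confidential plus the time your migration takes exceeds the time until a CRQC exists. A minimal sketch with placeholder durations (substitute your own):

```python
# Mosca's inequality: you are exposed when shelf_life + migration_time > time_to_crqc.
# All three durations are placeholders; substitute your organization's own estimates.
data_shelf_life_years = 10   # how long today's intercepted data must stay confidential
migration_time_years = 5     # a realistic enterprise-wide PQC migration effort
time_to_crqc_years = 8       # your planning assumption for a CRQC

exposure_years = data_shelf_life_years + migration_time_years - time_to_crqc_years
if exposure_years > 0:
    print(f"Exposed: data encrypted today stays readable for ~{exposure_years} of the years it must remain secret.")
else:
    print("No exposure under these assumptions; revisit them as the timelines above compress.")
```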
Cisco Builds the Quantum Plumbing
One item that doesn’t fit neatly into the advantage or error correction narratives but matters for the long game: Cisco introduced a universal quantum switch that routes qubits across different modalities at room temperature over standard telecom fiber, with roughly 4% fidelity loss.
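One way to read that 4% figure, on the assumption that the loss compounds per switching hop (my reading, not a claim Cisco has made): end-to-end fidelity falls off geometrically with path length.

```python
# Assumes ~4% fidelity loss per pass through the switch (my assumption, not Cisco's claim).
per_hop_fidelity = 0.96

for hops in (1, 3, 5, 10):
    print(f"{hops:>2} hops -> end-to-end fidelity ~ {per_hop_fidelity ** hops:.2f}")
# Roughly 0.96, 0.88, 0.82, and 0.66, which is why entanglement purification and
# quantum repeaters remain part of the long-term networking story.
```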
When the largest networking company on the planet builds a working prototype for quantum routing, it signals where the infrastructure investment is headed. Scaling to the logical qubit counts needed for industrial chemistry and the CRQC threshold will likely require modular, networked architectures. Cisco just built a piece of that puzzle.
Quantum Flapdoodle: When a Bitcoin Conference Tries to Abolish Quantum Mechanics
At Bitcoin 2026 in Las Vegas, the world’s largest Bitcoin conference gave its main stage to Jeff Booth, a venture capitalist with no background in physics, to present a thesis drawn from a roughly 220-page unreferenced paper. The argument: Bitcoin’s 10-minute block interval constitutes empirical proof that time itself is discrete rather than continuous. Because quantum mechanics assumes continuous time, quantum mechanics must be wrong. Therefore quantum computers cannot threaten Bitcoin. The presentation was delivered with absolute confidence to an audience of roughly 30,000.
Let that sink in. A payment network’s engineering parameter is being presented as a discovery about the fundamental nature of reality that falsifies a century of physics. I read the paper. Bitcoin’s own internals contradict the premise. Block discovery is modeled as a stochastic Poisson process with exponentially distributed waiting times, not as a deterministic ten-minute tick. Block timestamps are integer Unix-epoch fields constrained by consensus rules, not evidence for a fundamental chronon. The difficulty adjustment recalculates every 2,016 blocks to keep average production near ten minutes. None of this says anything about the ontology of time. Bitcoin runs on the very continuous-time mathematics the paper claims to have disproven. The paper contains no bibliography and no peer review. Its treatment of Shor’s algorithm amounts to arguing that if Shor’s algorithm worked, you could “run it on Bitcoin” by creating contradictory transactions in the mempool, confusing ECDSA private key recovery with the consensus mechanism.
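To see how non-deterministic the “ten-minute block” actually is, you can simulate it. Block discovery is well modeled as a Poisson process, so inter-block times are exponentially distributed around the 600-second target; this is a standard model of proof-of-work mining, not anything specific to the paper under discussion.

```python
import random
import statistics

# Bitcoin block discovery is well modeled as a Poisson process: inter-block times
# are exponentially distributed around the 600-second target, not a fixed tick.
random.seed(0)
mean_interval_s = 600.0
intervals = [random.expovariate(1.0 / mean_interval_s) for _ in range(100_000)]

print(f"mean     ~ {statistics.mean(intervals) / 60:5.1f} min")
print(f"median   ~ {statistics.median(intervals) / 60:5.1f} min")
print(f"under 1m ~ {sum(t < 60 for t in intervals) / len(intervals):.0%} of blocks")
print(f"over 30m ~ {sum(t > 1800 for t in intervals) / len(intervals):.0%} of blocks")
# Roughly 10% of blocks arrive within a minute of the previous one and about 5%
# take more than half an hour; there is no fundamental ten-minute tick anywhere
# in the protocol.
```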
It gets better. A follow-up “Declaration of Chronology” renames quantum mechanics to “Block-Wave Dynamics,” declares Bitcoin “the only logical and principled quantum computer,” and issues a “Call to the Miners” arguing that if the global hash rate approaches 10⁴³ hashes per second, Bitcoin would be “probing the substrate of time itself.”
This would be merely entertaining if the stakes were lower. Some fraction of those 30,000 attendees left believing the quantum threat is a philosophical error. That fraction represents real money in addresses with exposed public keys and real organizations delaying migration decisions. For the full analysis, see The Anatomy of Quantum Denial on PostQuantum.com.
From the Applied Quantum Desk
The Quantum Utility Map series represents the kind of research-driven analysis Applied Quantum brings to its advisory work. If you are evaluating which quantum use cases matter for your sector, building a readiness strategy, or navigating the PQC migration timeline, my team and I can help. We provide board briefings, quantum readiness assessments, vendor due diligence, and hands-on migration support.
If this was useful, forward it to a colleague who should be paying attention to quantum. If you spotted an error, hit reply — I read everything and correct publicly. And if you have thoughts on the new name, I’d like to hear those too.
— Marin



P.S. Infleqtion has not said anything publicly about adopting these new error correction codes, but since it shares the neutral-atom modality, one would expect the logical-qubit advances described above to be available to its platform as well. Curious to see whether that materializes.