What Is This?
Almost every popular explanation of quantum computing gets the same thing wrong. They say quantum computers are like normal computers but much, much faster — that instead of trying one answer at a time, they try all answers simultaneously, so hard problems that would take a classical computer millions of years take seconds.
This is not what quantum computers do. It is a description that sounds plausible, explains nothing about why quantum computing is actually interesting, and sets up a systematic misunderstanding of where quantum computers are useful and where they're not.
A quantum computer is not a faster classical computer. It is a fundamentally different type of machine that is useful for a specific and narrow class of problems, largely useless for most problems you'd actually want to solve, and genuinely transformative for the specific problems it can address.
What a quantum computer actually does:
Classical computers store information as bits — 0 or 1. Quantum computers store information as qubits — quantum two-level systems that can exist in a superposition of 0 and 1 simultaneously. This is not a metaphor. It is a physical fact about quantum mechanical systems.
When you compute with qubits in superposition, you are not "trying all answers simultaneously" in any simple sense. You are performing a computation on a quantum state that encodes a complex probability amplitude for every possible combination of 0s and 1s. The challenge — and the engineering art — is designing algorithms that manipulate these probability amplitudes so that, when you measure the final state, the probability of the correct answer is high and the probability of wrong answers is low.
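The amplitude picture can be made concrete with a toy sketch: a qubit as a plain numpy vector of two complex amplitudes, with probabilities given by the squared magnitudes (illustrative only; real hardware does not work by storing this vector).

```python
import numpy as np

# A qubit's state is a 2-component complex vector of probability amplitudes.
ket0 = np.array([1, 0], dtype=complex)   # the |0> basis state

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0               # amplitudes: 1/sqrt(2) for each outcome
probs = np.abs(psi) ** 2     # Born rule: probability = |amplitude|^2

print(probs)                 # 0.5 for measuring 0, 0.5 for measuring 1
```

Quantum algorithms are, at heart, sequences of such matrix operations chosen so the amplitude of the correct answer grows.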
The key operations are superposition (qubit in combination of 0 and 1), entanglement (qubits linked so that measuring one instantly determines the state of the other, regardless of distance), and interference (manipulating probability amplitudes to make correct answers more probable and wrong answers less probable).
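Entanglement can be sketched in the same toy state-vector picture: a Hadamard followed by a CNOT turns two independent qubits into a Bell state whose measurement outcomes are perfectly correlated (a numpy illustration, not hardware).

```python
import numpy as np

# Two-qubit state = 4 amplitudes, ordered |00>, |01>, |10>, |11>.
ket00 = np.array([1, 0, 0, 0], dtype=complex)

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)

# CNOT flips the second qubit when the first qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Hadamard on the first qubit, then CNOT: the Bell state (|00> + |11>)/sqrt(2).
bell = CNOT @ np.kron(H, I) @ ket00
print(np.abs(bell) ** 2)   # only 00 and 11 ever occur, each with probability 0.5
```

Measuring either qubit immediately fixes the other: the 01 and 10 outcomes have zero amplitude.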
The measurement collapses the quantum state to a definite answer. You then run the computation again — potentially many times — to build up statistical confidence in the result.
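The statistics-gathering step can be sketched by sampling repeated measurements from the Born-rule probabilities (again a numpy illustration of the idea, not how hardware is driven):

```python
import numpy as np

rng = np.random.default_rng(0)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = H @ np.array([1.0, 0.0])      # equal superposition

probs = np.abs(psi) ** 2
# Each run collapses to one definite outcome; repeat to estimate the distribution.
shots = rng.choice([0, 1], size=1000, p=probs)
counts = np.bincount(shots, minlength=2)
print(counts)                       # roughly [500, 500]
```

A thousand "shots" like this is the standard way results are reported on today's cloud quantum services.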
Why this only helps for specific problems:
For most computational problems — sorting a list, running a database query, rendering a video, serving a web request — quantum computers offer no advantage over classical computers. These problems don't have the structure that quantum algorithms can exploit. Classical computers are better at them: faster, cheaper, more reliable, no need for cryogenic cooling.
Quantum algorithms offer provable speedups for a specific set of problems:
- Integer factoring (Shor's algorithm): runs in polynomial time on a quantum computer, turning a 2048-bit factoring job that would take classical machines longer than the age of the universe into a tractable computation. This is the cryptography threat.
- Search in unsorted databases (Grover's algorithm): square-root speedup. Meaningful, not world-changing.
- Quantum simulation: simulating quantum mechanical systems — molecular interactions, chemical reactions, materials physics — where the problem is inherently quantum.
- Certain optimisation and sampling problems: quantum annealing, variational quantum algorithms — the speedup is contested and problem-specific.
The important list: almost everything else. Machine learning, image recognition, language models, web serving, database operations, most scientific computing — no demonstrated quantum advantage.^1
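Grover's square-root speedup can be seen in a toy state-vector simulation: for N items, roughly (pi/4)·sqrt(N) rounds of "oracle plus inversion about the mean" concentrate probability on the marked item. An illustrative numpy sketch, not a hardware implementation:

```python
import numpy as np

# Grover search over N = 16 items (4 qubits). The "oracle" flips the sign
# of the marked item's amplitude; it answers "is this the item?" in one call.
N = 16
marked = 11

state = np.full(N, 1 / np.sqrt(N))       # uniform superposition

oracle = np.ones(N)
oracle[marked] = -1                      # phase flip on the marked item

iterations = int(np.pi / 4 * np.sqrt(N))  # ~sqrt(N) rounds, not N
for _ in range(iterations):
    state = oracle * state                # mark by phase flip
    state = 2 * state.mean() - state      # inversion about the mean

probs = state ** 2
print(iterations, np.argmax(probs), round(probs[marked], 3))  # 3 11 0.961
```

Three oracle calls instead of an expected eight: useful, but a quadratic gain rather than the exponential one Shor's algorithm delivers for factoring.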
Why Does It Matter?
- Shor's algorithm breaks RSA and ECC — the encryption protecting almost all internet traffic. RSA (used in HTTPS, SSH, TLS) derives its security from the hardness of factoring large integers; Elliptic Curve Cryptography (used in Bitcoin and most TLS deployments) relies on the hardness of the elliptic-curve discrete logarithm problem. Shor's algorithm solves both efficiently, so a sufficiently large quantum computer breaks the encryption completely. Every encrypted communication is crackable. Every digital signature is forgeable. Every Bitcoin private key is derivable from its exposed public key.^2
- Harvest now, decrypt later is already happening. Nation-state adversaries (primarily China, Russia, and US intelligence services) are almost certainly storing encrypted internet traffic now, planning to decrypt it when quantum computers are capable enough. Your encrypted medical records, financial transactions, and classified communications that exist today will be readable in 10-20 years if they're being archived now. This is not speculation — it is the rational strategy for any adversary with a long time horizon, and signals intelligence agencies operate on exactly this time horizon.
- Post-quantum cryptography migration is underway right now — and you need to care. NIST finalised post-quantum cryptographic standards in 2024: ML-KEM (CRYSTALS-Kyber) for key encapsulation, ML-DSA (CRYSTALS-Dilithium) for digital signatures. These are based on lattice problems rather than factoring: mathematical problems believed to be hard even for quantum computers. Chrome and Firefox have already deployed hybrid post-quantum TLS (X25519MLKEM768). The migration to post-quantum encryption for all critical infrastructure is a decade-long engineering project that started in 2024. If you operate infrastructure — APIs, databases, authentication systems — the question of whether your cryptography is post-quantum-safe should be on your roadmap.^3
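The lattice idea behind these standards can be illustrated with a toy Regev-style learning-with-errors (LWE) scheme. This is NOT ML-KEM, and the parameters below are far too small to be secure; it only shows the structure: security rests on recovering a secret vector from noisy linear equations, a problem with no known efficient quantum algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Regev-style LWE encryption (illustration only; parameters are insecure).
q, n, m = 257, 10, 40
s = rng.integers(0, q, n)              # secret key
A = rng.integers(0, q, (m, n))         # public random matrix
e = rng.integers(-1, 2, m)             # small noise: the "errors" in LWE
b = (A @ s + e) % q                    # public key is the pair (A, b)

def encrypt(bit):
    r = rng.integers(0, 2, m)          # random subset of the public rows
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q   # encode the bit as 0 or ~q/2
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q                # = bit*(q//2) + small accumulated noise
    return int(q // 4 < d < 3 * q // 4)  # closer to q/2 means the bit was 1

u0, v0 = encrypt(0)
u1, v1 = encrypt(1)
print(decrypt(u0, v0), decrypt(u1, v1))  # 0 1
```

Decryption works because the noise stays well below q/4; an attacker without s faces the lattice problem of separating A·s from the noise.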
- Quantum simulation is the near-term application — and it's genuinely transformative for science. The most plausible near-term quantum advantage is simulating quantum systems: molecular interactions, reaction pathways, materials physics. Classical computers cannot efficiently simulate even moderately complex quantum systems because the state space grows exponentially with the number of particles. Quantum computers can simulate quantum systems efficiently because they are themselves quantum systems. Applications: drug discovery (simulating protein-drug interactions at quantum mechanical accuracy), materials design (developing better superconductors, photovoltaics, catalysts), fertiliser production (simulating the nitrogen fixation mechanism in nitrogenase to design catalysts that could drastically reduce the energy cost of artificial nitrogen fixation — the Haber-Bosch process currently uses 1-2% of the world's entire energy supply).
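The exponential blow-up is easy to make concrete: a classical simulator must store 2^n complex amplitudes for n qubits, at 16 bytes each for double-precision complex numbers.

```python
# Classical memory needed to hold the full state vector of n qubits:
# 2**n complex amplitudes at 16 bytes each (complex128).
def state_vector_bytes(n):
    return 2 ** n * 16

for n in (30, 50, 100):
    print(f"{n} qubits: {state_vector_bytes(n) / 2**30:.3g} GiB")
```

Thirty qubits already needs 16 GiB; fifty qubits needs about 16 million GiB; a hundred qubits exceeds any conceivable classical storage. This is the core reason simulating quantum systems is the most plausible near-term quantum application.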
- Current quantum computers are not yet capable of breaking meaningful cryptography. The quantum computers that exist in 2026 have 1,000–5,000 physical qubits with high error rates. Breaking 2048-bit RSA would require roughly 4,000 logical qubits — error-corrected qubits that require approximately 1,000 physical qubits each. That's ~4 million physical qubits with current error correction overhead. The leading companies (IBM, Google, IonQ, Quantinuum) are on roadmaps that project reaching this scale in the 2030s. The threat is real; the timeline is roughly a decade.
Key People & Players
Peter Shor (MIT) — Developed Shor's algorithm in 1994, proving that quantum computers can factor integers in polynomial time. The single most important result in quantum computing — it transformed the field from theoretical curiosity to strategic national security concern. Before Shor, quantum computing was interesting. After Shor, every major intelligence agency needed to pay attention.^4
Lov Grover (Bell Labs) — Developed Grover's algorithm in 1996: a quantum algorithm providing a quadratic speedup for searching unsorted databases. Less dramatic than Shor's, but more broadly applicable. The algorithm has been proved asymptotically optimal for unstructured search.^5
John Preskill (Caltech) — Coined the term "quantum supremacy" (now often called "quantum advantage"), did foundational work on quantum error correction and fault tolerance, and writes the most authoritative technical blog on quantum computing (Quantum Frontiers). His lecture notes on quantum computation are the standard graduate-level introduction.^6
David Deutsch (Oxford) — Proposed the first quantum algorithm (Deutsch's algorithm, 1985) and the theoretical model of the quantum Turing machine. His foundational work established quantum computation as a field and, importantly, was motivated by his Many Worlds interpretation of quantum mechanics — he argued that quantum computers work by performing computations in parallel across multiple branches of the wave function.
Google Quantum AI / IBM Quantum / Quantinuum / IonQ — The four primary companies pushing toward fault-tolerant quantum computation. Google claimed "quantum supremacy" in 2019 (a task it estimated would take classical computers 10,000 years took its 53-qubit Sycamore processor 200 seconds — a claim subsequently contested). IBM has the most developed public cloud quantum access programme. Quantinuum (formed from the merger of Honeywell Quantum Solutions and Cambridge Quantum) focuses on trapped-ion qubits with higher fidelity.
The Current State
Quantum computing is in the NISQ era (Noisy Intermediate-Scale Quantum) — devices with 50-5,000 qubits that are too error-prone for fault-tolerant computation but potentially useful for specific tasks. Quantum advantage for practical problems in this regime has not been convincingly demonstrated beyond contrived benchmark tasks.
The key milestones ahead:
- Logical qubit demonstration at scale: error-corrected logical qubits that perform arbitrarily long computations reliably. Google demonstrated error suppression in a single logical qubit in 2023. IBM and Quantinuum are on similar trajectories. The key threshold: a logical error rate below 1 in 10^6 operations.
- Quantum advantage for useful problems: demonstrating a speedup on a problem of commercial or scientific value, not just a specially constructed benchmark.
- Cryptographically relevant quantum computing (CRQC): the threshold at which quantum computers can break 2048-bit RSA. Current estimates: 2030–2040.
Post-quantum migration is the most urgent near-term action:
- NIST PQC standards are finalised (ML-KEM, ML-DSA, SLH-DSA)
- Chrome and Firefox deploying hybrid post-quantum TLS
- The US government has mandated post-quantum migration for federal agencies
- If you use long-lived secrets (certificate authorities, document signing keys, authentication tokens for critical infrastructure), the migration should be active, not planned
Best Resources to Learn More
- Preskill's lecture notes on quantum computation (Caltech) — The authoritative technical reference. Free, comprehensive, graduate-level.^7
- Quantum Computing: An Applied Approach by Jack Hidary — The most practical introduction to actually writing quantum programs.^8
- Scott Aaronson's blog: Shtetl-Optimized — The most reliably accurate public commentary on quantum computing claims and hype. Aaronson has debunked more quantum computing misinformation than anyone else.^9
- NIST Post-Quantum Cryptography standards — The practical implementation resource for post-quantum migration.^10
- Q is for Quantum by Terry Rudolph — Unusual visual approach to quantum mechanics fundamentals. Best for building intuition before tackling algorithms.^11