Glossary of Terms
Acqui-Hire as Only Exit
The scenario where a quantum computing startup's primary acquisition value is its team of PhD physicists and engineers rather than its technology or market position, resulting in acquisition prices that represent a loss for investors.
Acqui-hires typically return only a fraction of invested capital because the acquisition price reflects the cost of hiring equivalent talent, not the value of a commercial technology product.
Example: A quantum computing startup that raised $100 million might be acquired for $20-30 million in an acqui-hire, returning 20-30 cents on the dollar to investors.
AGI Hype Comparison
A comparative analysis of quantum computing and artificial general intelligence (AGI) hype, noting similarities in unfalsifiable timelines, media amplification, enormous investment, and the gap between narrow demonstrations and claimed broad capabilities.
The AGI comparison is instructive but imperfect — unlike quantum computing, narrow AI is already commercially successful and improving rapidly, providing a clearer path from current capabilities to broader applications.
AI/ML as Emerging GPT
Artificial intelligence and machine learning as an emerging General Purpose Technology showing early GPT characteristics: broad applicability, rapid improvement, and enabling complementary innovations across many sectors.
AI/ML is the most directly relevant comparison because it competes with quantum computing for some of the same applications (optimization, simulation, drug discovery) while demonstrating clearly superior GPT characteristics and actual commercial returns.
Example: AI/ML generated hundreds of billions in commercial revenue by 2025, transformed multiple industries, and is accessible to any organization with standard computing hardware — contrasting sharply with quantum computing's zero commercial returns.
All Must Happen Together
The critical observation that the required breakthroughs are not independently sufficient — they must all be achieved simultaneously on the same hardware platform for commercial quantum computing to work.
This requirement dramatically reduces the probability of success because individual breakthrough probabilities must be multiplied together, not added (see the sketch below).
See also: Joint Probability Problem
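A minimal sketch of this multiplication, using purely illustrative probabilities for breakthroughs named elsewhere in this glossary (each value below is an assumption for demonstration, not an estimate from this course):

```python
# Joint probability of independent required breakthroughs.
# Every probability below is an illustrative assumption, not an estimate.
from math import prod

breakthrough_probs = {
    "error rates drop 100x": 0.5,
    "coherence improves 100x": 0.5,
    "million-qubit scaling": 0.3,
    "cryogenics scale": 0.4,
    "cost drops dramatically": 0.4,
}

joint = prod(breakthrough_probs.values())
print(f"Joint probability (all on one platform): {joint:.3f}")  # ~0.012
```

Even when each breakthrough individually looks like a coin flip or better, the joint probability of all five landing on the same hardware platform is around 1%.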
Almost All Jobs Are R&D
The observation that essentially 100% of quantum computing positions are research and development roles, with no significant commercial operations, sales, or customer support functions — indicating the technology has no market to serve.
This R&D-only job profile is normal for TRL 2-3 technologies but inconsistent with marketing claims that quantum computing is approaching commercial deployment.
Alternative Tech Investments
The broader set of technology investments that compete with quantum computing for capital allocation, including AI/ML, advanced classical computing, quantum sensing, biotechnology, renewable energy, and space technology.
Evaluating quantum computing against alternative investments is essential for rational capital allocation. The relevant question is not whether quantum computing could eventually work, but whether it offers better risk-adjusted returns than alternatives.
Anchoring Bias
The tendency to rely too heavily on the first piece of information encountered (the "anchor") when making judgments. In quantum computing, early theoretical possibilities anchor expectations despite subsequent evidence of practical limitations.
Shor's algorithm (1994) serves as a powerful anchor that shapes all subsequent quantum computing evaluation. The theoretical promise of exponential speedup anchors expectations far above what has been demonstrated in practice.
Example: Hearing that "quantum computers can break all encryption" anchors thinking around quantum computing's power, even after learning this requires millions of error-corrected qubits that do not exist.
Applying Skepticism Broadly
The principle that the critical thinking tools developed for evaluating quantum computing claims should be applied to all technology investment and policy decisions, not just quantum computing.
This course uses quantum computing as a case study, but the analytical frameworks — base rates, expected value, bias recognition, systems thinking — are universally applicable. Students who master these tools gain capabilities that extend far beyond this single topic.
Atomic Clocks
Timekeeping devices that use quantum transitions in atoms (typically cesium or rubidium) as their frequency standard, so accurate that they neither gain nor lose a second over millions of years. Already a mature, commercially successful quantum technology.
Atomic clocks demonstrate that quantum technology can be enormously valuable without requiring superposition of large numbers of qubits, error correction, or cryogenic cooling. They work with quantum properties of individual atoms at well-defined energy transitions.
Example: The GPS satellite system depends on atomic clocks accurate to within nanoseconds. Without quantum-based atomic clocks, GPS would accumulate positioning errors of kilometers per day.
Authority Bias
The tendency to attribute greater accuracy to the opinions of authority figures. In quantum computing, the endorsement of Nobel laureates, tech CEOs, and government officials lends unwarranted credibility to optimistic projections.
Authority bias is especially problematic when the authority's expertise is in a different domain than the specific question. A Nobel laureate in condensed matter physics may not have expertise in quantum computing economics or systems engineering.
Example: When a Nobel Prize-winning physicist endorses quantum computing investment, their prestige lends weight to the claim even if their expertise is in a different area of physics.
Autonomous Vehicle Comparison
A comparative analysis of quantum computing and autonomous vehicle hype patterns, noting similar dynamics of overoptimistic timelines, technical barriers underestimated by proponents, and massive investment preceding commercial viability.
Autonomous vehicles provide a recent, well-documented case of technology hype where specific, testable predictions (fully autonomous taxis by 2020) failed to materialize despite billions in investment, and timelines continue to slip.
Example: In 2015, multiple companies predicted fully autonomous vehicles by 2020. As of 2025, Level 4 autonomy exists only in limited, geofenced environments — a trajectory strikingly similar to quantum computing's pattern of retreating timelines.
Balancing Feedback Loop
A causal loop where the output counteracts the input, creating stability or goal-seeking behavior. In the quantum computing ecosystem, balancing loops (market reality, technical failure, investor scrutiny) that should check hype are unusually weak.
The weakness of balancing loops in the quantum computing system is a key insight. Normal market feedback mechanisms (revenue, profitability, customer demand) that would correct overinvestment in most sectors are short-circuited by government funding, corporate strategic positioning, and hype.
Bandwagon Effect
The tendency to adopt beliefs or behaviors because many others have done so. In quantum computing, the participation of Google, IBM, Microsoft, and major governments creates a "too big to question" dynamic.
The bandwagon effect is particularly powerful in quantum computing because prestigious participants lend credibility. "If Google and IBM are investing billions, they must know something we don't" is bandwagon reasoning that substitutes social proof for technical analysis.
Example: When China announced a $15 billion quantum computing initiative, it triggered increased funding from the U.S., EU, and other nations — not because of new technical evidence but because of competitive bandwagon dynamics.
Base Rate of Tech Failure
The historical frequency at which ambitious new technology programs fail to achieve commercial viability. For technologies requiring fundamental physics breakthroughs, the failure rate exceeds 90%.
Base rates provide an essential prior probability for Bayesian reasoning about quantum computing. Before hearing any specific argument about why quantum computing will succeed, the base rate of similar ambitious technology programs suggests approximately 90% probability of failure.
Example: Of the major physics-based technology programs initiated in the past 50 years (fusion, superconducting computing, molecular electronics, etc.), fewer than 10% have achieved commercial viability.
See also: Base Rate Reasoning, Bayesian Reasoning Basics
Base Rate Reasoning
The practice of starting with the historical frequency of similar events (the base rate) before incorporating specific case information, as recommended by Bayesian statistics. The base rate for ambitious physics-based technologies achieving commercial viability is roughly 10%.
Base rate reasoning is perhaps the single most powerful corrective to overoptimistic technology forecasting. Starting from "90% of similar technologies fail" and then asking "what specific evidence suggests this case is different?" produces far more accurate estimates.
Example: Before considering any specific argument about quantum computing's potential, note that the base rate of revolutionary physics-based technologies achieving commercial viability is roughly 10%. Any honest probability estimate should start from this baseline.
See also: Bayesian Reasoning Basics
Bayesian Reasoning Basics
An approach to probabilistic reasoning where prior beliefs are updated with new evidence using Bayes' theorem: \(P(H|E) = \frac{P(E|H) \cdot P(H)}{P(E)}\), where \(P(H)\) is the prior probability, \(P(E|H)\) is the likelihood of evidence given the hypothesis, and \(P(H|E)\) is the updated probability.
Bayesian reasoning provides the mathematical foundation for rational belief updating about quantum computing. Starting with a prior probability (from base rates), each new piece of evidence should systematically adjust the estimate up or down.
Example: Starting with a 10% prior probability of commercial quantum computing by 2035, observing that no quantum computer has demonstrated commercial advantage by 2025 should lower this probability, while a genuine fault-tolerant demonstration would raise it.
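A minimal sketch of this update in Python; the likelihood values are assumptions chosen for demonstration, not measured quantities:

```python
# Bayesian update for a binary hypothesis H ("commercial QC by 2035").
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H|E) via Bayes' theorem."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e

prior = 0.10  # base-rate prior for commercial QC by 2035
# Evidence E: no commercial advantage demonstrated by 2025.
# Assumed likelihoods: P(E|H) = 0.4 (a slow start is possible even if QC
# eventually succeeds); P(E|not H) = 0.95 (a drought is near-certain if not).
posterior = bayes_update(prior, p_e_given_h=0.4, p_e_given_not_h=0.95)
print(f"prior {prior:.2f} -> posterior {posterior:.3f}")  # ~0.045
```

Under these assumed likelihoods the evidence cuts the 10% prior to under 5%; a genuine fault-tolerant demonstration, plugged in as different likelihoods, would push the posterior up instead.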
Bias in Tech Forecasting
The systematic overoptimism in technology forecasting driven by forecaster incentives, anchoring to best-case scenarios, neglect of base rates, and the human tendency to extrapolate current trends linearly.
Technology forecasting in quantum computing is doubly biased: the general bias toward optimism in all technology forecasting is compounded by the specific financial incentives of the quantum computing ecosystem.
Board-Level QC Questions
A set of questions that corporate board members should ask when presented with quantum computing investment proposals, designed to cut through hype and surface the actual risk-return profile.
Board members typically lack the technical background to evaluate quantum computing claims independently. These questions are designed to be askable by non-experts while effectively probing the investment's validity.
Example: Key board-level questions include: "What specific problem will this solve that we cannot solve classically?" "What is our exit strategy if the technology does not mature?" "What is the expected value of this investment using pessimistic probability estimates?"
Breaking the Hype Loop
Potential interventions that could weaken the reinforcing loops sustaining quantum computing overinvestment, including mandatory technical audits, standardized benchmarking against classical baselines, and reformed incentive structures for researchers and forecasters.
Breaking the loop requires introducing or strengthening balancing feedback mechanisms. This is difficult because every actor in the system benefits from its continuation in the short term.
Building a Claims Tracker
The practice of systematically recording specific, testable predictions made by quantum computing companies, researchers, and forecasters, along with their outcomes, to build an empirical database for evaluating forecaster credibility.
A claims tracker converts vague technology optimism into accountable, verifiable statements. It is one of the most practical tools for cutting through quantum computing hype.
Example: Recording "Company X predicts achieving quantum advantage on drug discovery by Q4 2026" and checking the outcome provides concrete data for assessing that company's forecasting credibility.
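A minimal sketch of such a tracker in Python; "Company X" and its claims are hypothetical, following the example above:

```python
# Hypothetical claims tracker; the entries below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Claim:
    source: str                   # who made the prediction
    prediction: str               # the specific, testable statement
    deadline: int                 # year by which it becomes checkable
    outcome: bool | None = None   # True = met, False = missed, None = pending

claims = [
    Claim("Company X", "quantum advantage on drug discovery", 2026),
    Claim("Company X", "fault-tolerant prototype", 2023, outcome=False),
    Claim("Company X", "1,000-qubit processor in production use", 2024, outcome=False),
]

def hit_rate(source: str, record: list[Claim]) -> float | None:
    """Fraction of a source's resolved claims that were met."""
    resolved = [c for c in record if c.source == source and c.outcome is not None]
    return sum(c.outcome for c in resolved) / len(resolved) if resolved else None

print(f"Company X hit rate on resolved claims: {hit_rate('Company X', claims):.0%}")
```

Even this crude hit rate converts "trust us" into a number, and the pending entries create accountability for claims that have not yet come due.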
Career Incentive Loop
The reinforcing feedback loop where quantum computing careers depend on continued funding, which depends on optimistic forecasts, which require active researchers, who need career stability — creating incentives for insiders to maintain optimistic narratives.
This loop explains why insider forecasts are systematically overoptimistic: the forecasters' livelihoods depend on the field continuing. Honest skepticism from insiders is career-threatening, so it rarely emerges publicly.
Catalog of Broken Promises
A systematic compilation of specific predictions, timelines, and performance claims made by quantum computing companies, researchers, and consultants that failed to materialize.
Maintaining a catalog of broken promises provides concrete evidence for evaluating the credibility of new claims. Patterns in past failures — who made them, what incentives existed, how accountability was handled — reveal structural problems in quantum computing forecasting.
Example: Tracking D-Wave's, IBM's, and Google's major announcements from 2007-2025 reveals that virtually every specific performance or timeline claim has been missed or quietly revised.
Causal Loop Diagrams
Visual representations of cause-and-effect relationships in a system, showing how variables influence each other through reinforcing (amplifying) and balancing (stabilizing) feedback loops, typically drawn with arrows and polarity indicators (+ or -).
Causal loop diagrams make the hidden dynamics of quantum computing investment visible, revealing why the system behaves as it does and identifying potential leverage points for change.
Example: A causal loop diagram of quantum computing investment shows that "hype generates funding" which "funds PR and lobbying" which "generates more hype" — a reinforcing loop that operates independently of technical progress.
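A minimal sketch of that loop as a discrete-time simulation; the gain coefficients are illustrative assumptions, chosen to show a strong reinforcing loop paired with a weak balancing one:

```python
# Discrete-time sketch of the "hype -> funding -> PR -> hype" reinforcing loop.
GAIN = 0.3            # strength of the reinforcing links (assumed)
REALITY_CHECK = 0.05  # strength of the balancing link (assumed weak)

hype, funding = 1.0, 1.0
for year in range(10):
    funding += GAIN * hype                         # hype attracts funding (+)
    hype += GAIN * funding - REALITY_CHECK * hype  # funding buys PR (+); reality pushes back (-)
    print(f"year {year}: hype={hype:.1f}, funding={funding:.1f}")
# Because the balancing term is small, both variables grow without bound:
# the loop runs on its own dynamics, independent of technical progress.
```

Raising REALITY_CHECK (stronger audits, benchmarks, revenue scrutiny) is the simulated equivalent of the interventions discussed under Breaking the Hype Loop.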
Charismatic Founder Risk
The investment risk created when a charismatic, persuasive founder substitutes personal conviction and vision for evidence of technical viability, attracting investment based on narrative rather than fundamentals.
The quantum computing ecosystem has several charismatic founders whose personal persuasiveness exceeds their organizations' demonstrated capabilities. Investors should be particularly cautious when the investment case depends more on the founder's vision than on verifiable technical milestones.
Classical AI Hardware
Specialized processors (GPUs, TPUs, custom ASICs) optimized for artificial intelligence and machine learning workloads, representing an alternative investment that competes with quantum computing for some of the same claimed applications.
Classical AI hardware is already delivering the optimization, simulation, and pattern recognition capabilities that quantum computing promises for the future. Investment returns from AI hardware have been extraordinary, providing a concrete opportunity cost comparison.
Example: NVIDIA's market capitalization grew from approximately $150 billion to over $2 trillion between 2020 and 2025, driven by AI hardware demand — a return that quantum computing investments have not remotely approached.
Classical Alternatives Cheaper
The economic reality that for every problem quantum computers have been applied to, classical computing solutions exist that are orders of magnitude cheaper, more reliable, and more accessible.
Until quantum computing can solve a problem that classical computers literally cannot solve at any cost, or can solve a problem dramatically cheaper, there is no economic justification for commercial adoption.
Example: A classical simulation of a small molecule that costs $0.01 on cloud computing would cost thousands of dollars on current quantum hardware, assuming the quantum computation could be completed at all.
Classical Computers Keep Up
The observation that classical computing continues to advance rapidly through hardware improvements (GPUs, TPUs, specialized accelerators), algorithmic innovations, and software optimization, frequently narrowing or eliminating claimed quantum advantages.
Every year that quantum computing fails to deliver, classical computing gets better. This moving-target problem means that the bar for quantum advantage keeps rising, often faster than quantum hardware improves.
Example: When Google claimed quantum supremacy in 2019, IBM quickly argued that improved classical algorithms and storage could reduce the classical computation time from 10,000 years to 2.5 days. By 2022, classical tensor network methods reduced it further.
Classical Computing
Computing based on binary digits (bits) that exist in definite states of 0 or 1, using deterministic logic gates to perform calculations. This includes all conventional computers from smartphones to supercomputers.
Classical computing is the baseline against which quantum computing must demonstrate superiority. Throughout this course, we examine whether quantum approaches can deliver meaningful advantages over classical systems that continue to improve rapidly.
Example: A modern GPU cluster can simulate many quantum circuits with up to 50 qubits, often matching or exceeding the performance of current quantum hardware on practical problems.
Cognitive Bias Overview
Systematic patterns of deviation from rationality in judgment and decision-making, rooted in how the human brain processes information under uncertainty. Cognitive biases are not character flaws — they are features of human cognition that can be identified and mitigated.
Understanding cognitive biases is essential for this course because multiple reinforcing biases combine to sustain quantum computing investment despite consistently negative evidence. Recognizing these patterns is the first step toward making better technology investment decisions.
Coherence Must Improve 100x
The requirement that quantum coherence times must increase by approximately two orders of magnitude to allow deeper quantum circuits and more complex computations.
Coherence time improvement has been incremental over decades. Achieving a 100x improvement may require fundamentally new qubit designs or entirely new approaches to environmental isolation.
Cold Fusion Career Warning
The cautionary example of cold fusion, where initial excitement led researchers to invest their careers in a phenomenon that proved irreproducible, resulting in damaged reputations and wasted scientific effort.
Cold fusion collapsed more rapidly than quantum computing would likely contract, but the career damage to researchers who had committed fully was severe and lasting.
Cold Fusion Losses
The 1989 cold fusion episode where Fleischmann and Pons' claims of room-temperature nuclear fusion triggered worldwide investment that was ultimately lost when the results proved irreproducible.
Cold fusion is a cautionary tale about how scientific claims can attract massive investment before proper verification, and how institutional momentum can sustain investment even after doubts emerge.
Example: Following the 1989 announcement, governments and corporations worldwide invested hundreds of millions of dollars in cold fusion research before the phenomenon was definitively discredited.
Comparable Company Analysis
A valuation method that compares quantum computing companies to similar firms. The challenge is that there are no truly comparable companies because no technology sector has the combination of massive investment, zero commercial returns, and fundamental physics uncertainty that characterizes quantum computing.
Analysts sometimes compare quantum computing companies to early-stage biotech or semiconductor startups, but these analogies are flawed because those sectors have established paths from research to commercialization that quantum computing lacks.
Concorde Economics Failure
The Concorde supersonic airliner as an example of a technology that worked technically but failed economically — too expensive to operate at a profit despite being an impressive engineering achievement.
The Concorde parallel is particularly relevant to quantum computing because it demonstrates that technical success does not guarantee economic viability. Even if quantum computers work, they may be too expensive for any commercial application.
Example: Concorde could fly at Mach 2 but cost 5-10 times more per seat-mile than subsonic alternatives. It operated for 27 years but never achieved profitability, ultimately being retired when economics could no longer be ignored.
Conducting Red Team Analysis
A structured process where a team deliberately argues against a proposed quantum computing investment or strategy, identifying weaknesses, risks, and assumptions that proponents may have overlooked or minimized.
Red team analysis is one of the most effective debiasing techniques because it formally assigns the role of critic, removing social pressure to conform to group optimism. Every significant quantum computing investment decision should include a red team review.
Example: A red team analyzing a proposed $50 million quantum computing investment would systematically challenge each claimed benefit, identify the classical alternatives, calculate expected value under pessimistic assumptions, and present the case for investing elsewhere.
Confirmation Bias
The tendency to search for, interpret, and remember information that confirms pre-existing beliefs while ignoring contradictory evidence. In quantum computing, investors and researchers selectively attend to positive results and dismiss negative findings.
Confirmation bias is reinforced by the structure of the quantum computing ecosystem: conferences feature success stories, journals prefer positive results, and media amplifies breakthroughs while ignoring failures or retractions.
Example: An investor who believes in quantum computing will notice every paper claiming incremental progress while dismissing studies showing no quantum advantage, expert skepticism, or company financial losses.
Connectivity Limitations
The restriction that most quantum hardware architectures only allow interactions between physically adjacent qubits. Performing operations between distant qubits requires SWAP operations that introduce additional errors and slow computation.
Limited connectivity forces quantum algorithms to use extra operations to move information between non-adjacent qubits, increasing circuit depth and error accumulation. This practical constraint reduces the effective power of quantum hardware below theoretical predictions.
Example: On IBM's heavy-hex lattice topology, each qubit connects to at most 3 neighbors. Performing a two-qubit gate between qubits that are 10 positions apart may require 9 additional SWAP operations, each introducing gate errors.
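A minimal sketch of the routing cost, assuming the standard decomposition of one SWAP into three two-qubit gates and the error rate cited under Error Rate Problem:

```python
# Error cost of routing on nearest-neighbor hardware.
# Assumes one SWAP = 3 two-qubit gates and independent gate errors.
gate_error = 1e-3    # per two-qubit gate (see Error Rate Problem)
swaps_needed = 9     # routing across 10 positions, per the example above
total_gates = 1 + 3 * swaps_needed  # intended gate plus routing overhead

success = (1.0 - gate_error) ** total_gates
print(f"{total_gates} physical gates for 1 logical operation; "
      f"success probability {success:.3f}")  # ~0.972
```

A roughly 3% failure chance for a single logical operation compounds catastrophically across the thousands of operations a useful algorithm requires.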
Connectivity Must Improve
The requirement that qubit-to-qubit connectivity must increase significantly from current nearest-neighbor architectures to enable efficient implementation of quantum algorithms that require long-range interactions.
Limited connectivity forces the use of SWAP operations that increase circuit depth and error rates, effectively reducing the computational power of the system below what raw qubit counts suggest.
Consultant Hype Reports
Market research and consulting reports (from firms like McKinsey, BCG, Gartner) that project massive future markets for quantum computing, typically using assumptions that implicitly require all technical barriers to be solved.
These reports are influential with corporate and government decision-makers but rarely disclose the conditional nature of their projections. They function as marketing tools for the consulting firms and as justification for continued investment.
See also: McKinsey $450B Projection
Contrived Benchmarks
Test problems specifically designed to showcase quantum hardware strengths while avoiding its weaknesses. These benchmarks demonstrate quantum capabilities but do not reflect problems anyone needs to solve commercially.
Contrived benchmarks are a form of cherry-picking that inflates perceptions of quantum computing progress. Critical evaluation requires asking: "Would anyone pay to have this specific problem solved?"
Example: Random circuit sampling, boson sampling, and quantum volume tests are all benchmarks designed to favor quantum hardware; none has a known commercial application.
Controlling 2^1000 Parameters
Dyakonov's observation that a 1,000-qubit quantum computer's state space has \(2^{1000}\) complex dimensions, and controlling this system with the precision required for useful computation may be physically impossible — not merely technically difficult.
This number (\(2^{1000}\)) is inconceivably large — greater than the number of atoms in the observable universe by hundreds of orders of magnitude. The idea that any physical system can be controlled with sufficient precision across this many dimensions is, at minimum, unproven.
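To make the scale concrete: \(2^{1000} = 10^{1000 \log_{10} 2} \approx 10^{301}\), while the observable universe contains roughly \(10^{80}\) atoms, so the state space exceeds the atom count by about 220 orders of magnitude.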
Corporate R&D Burn Rate
The rate at which quantum computing divisions of major corporations consume capital without generating commercial returns, typically tens to hundreds of millions of dollars annually per company.
Corporate burn rates are sustained by the strategic rationale of "staying in the game" and FOMO rather than by near-term commercial logic. Companies fear being left behind if quantum computing does work, even if the probability is low.
Example: IBM, Google, and Microsoft each spend an estimated $100-500 million annually on quantum computing R&D, funded from profitable classical computing businesses.
Cost Must Drop Dramatically
The requirement that the total cost of quantum computation must decrease by several orders of magnitude to compete with classical alternatives on any commercially relevant problem.
Current quantum computing costs are so far above classical alternatives that even significant cost reductions would leave quantum computing uncompetitive for most applications.
Cost Per Computation
The total cost of performing a specific computation on a quantum computer versus a classical computer, including hardware amortization, energy, cooling, error correction overhead, and operational costs.
Cost per computation is the metric that ultimately determines commercial viability. Current quantum computers are estimated to be millions to billions of times more expensive per useful computation than classical alternatives.
Could QC Break Encryption?
A sufficiently large, fault-tolerant quantum computer running Shor's algorithm could break RSA and ECC encryption. However, this requires millions of high-quality physical qubits with error rates far below current capabilities.
The encryption threat is real in principle but wildly overstated in practice. The required quantum hardware is so far beyond current capabilities that the timeline for this threat is measured in decades, if it materializes at all.
Example: Breaking a 2,048-bit RSA key would require an estimated 20 million physical qubits operating with error rates of \(10^{-6}\) — roughly 20,000 times more qubits than exist today, with 1,000 times better error rates.
Critical Thinking Skills
The cognitive abilities required to evaluate claims objectively, including logical reasoning, evidence assessment, bias recognition, probabilistic thinking, and the willingness to update beliefs based on new information.
This course treats critical thinking as a transferable skill set that is valuable far beyond quantum computing. The analytical frameworks taught here apply to any technology investment, policy decision, or scientific claim.
Crossing the Chasm
Geoffrey Moore's framework describing the gap between early adopter enthusiasm and mainstream market adoption, particularly relevant for disruptive technologies that must demonstrate pragmatic value to succeed commercially.
Quantum computing has not yet crossed from the "innovators" to the "early adopters" stage — it has not even reached the chasm, let alone crossed it. The chasm concept highlights that technology enthusiasm does not automatically translate to commercial demand.
Cryogenic Cooling Requirement
The need for most quantum computing platforms to operate at temperatures near absolute zero (typically 10-20 millikelvin), requiring expensive dilution refrigerators that consume significant power and limit system scalability.
Cryogenic requirements add enormous cost and complexity to quantum computing systems, and they do not scale linearly — cooling larger systems requires disproportionately more resources. This is a practical barrier often underestimated in economic projections.
Example: A dilution refrigerator for a quantum computer costs $1-5 million, consumes 10-25 kW of power, and takes days to cool down from room temperature to operating temperature.
Cryogenic Cost and Scale
The economic burden of cryogenic systems, including capital costs ($1-5M per dilution refrigerator), operating costs (electricity, helium), maintenance, and the fundamental engineering challenge of scaling cooling capacity to million-qubit systems.
Current dilution refrigerators have limited cooling power at millikelvin temperatures and limited physical volume for housing qubits and wiring. Scaling to millions of qubits likely requires entirely new cryogenic architectures that do not yet exist.
Example: A single dilution refrigerator can currently house roughly 1,000-5,000 qubits. A million-qubit system might require hundreds of interconnected refrigerators, raising massive integration challenges.
Cryogenics Must Scale
The requirement that cryogenic cooling systems must be redesigned to support million-qubit systems, far beyond the capacity of current dilution refrigerators.
No existing cryogenic technology can cool and house a million-qubit system with the required wiring and control infrastructure. This is not an incremental engineering challenge — it may require entirely new approaches to low-temperature engineering.
Crypto Threat Is Overstated
The assessment that quantum computing's threat to cryptography is exaggerated in both timeline and severity: the required hardware is decades away (if achievable at all), defenses are already being deployed, and most encrypted data loses value long before it could theoretically be decrypted.
The crypto scare serves an important role in the quantum computing hype ecosystem: it provides a national security justification for continued government funding even when commercial applications fail to materialize.
Customer Demand Evidence
The empirical basis (or lack thereof) for claims about commercial demand for quantum computing services, typically extrapolated from surveys, pilot programs, and speculative market sizing rather than actual purchasing behavior.
Genuine customer demand evidence would include signed contracts, purchase orders, or recurring revenue from quantum computations. Virtually none of this exists. What is cited as "demand" is typically curiosity, exploratory budgets, or government-funded research.
D-Wave "Commercial" QC 2007
D-Wave Systems' announcement of the first commercially available quantum computer, a quantum annealer that uses a fundamentally different and more limited approach than gate-based quantum computing.
D-Wave's machines sparked enormous publicity but also controversy. They use quantum annealing rather than universal gate-based computation, are limited to optimization problems, and have never conclusively demonstrated speedup over classical solvers on practical problems.
Example: D-Wave's original 2007 demonstration was widely criticized by academic researchers who questioned whether the device was truly performing quantum computation.
D-Wave Exaggerated Claims
D-Wave Systems' history of making performance claims that exceeded what independent testing could verify, including disputed claims of quantum speedup and commercially misleading marketing.
D-Wave's pattern of exaggerated claims illustrates how commercial pressures in quantum computing incentivize hype over scientific rigor. Their marketing consistently implied broader capabilities than their quantum annealers actually possessed.
Example: D-Wave repeatedly claimed "quantum speedup" in press releases, while peer-reviewed studies found their systems performed no faster than classical solvers on equivalent problems.
D-Wave Revenue Reality
D-Wave Systems' revenue history: despite being the longest-operating "commercial" quantum computing company (since 2007), D-Wave has generated only modest annual revenue, primarily from government grants and consulting rather than quantum computation services.
D-Wave's 18+ year revenue trajectory is the longest available data series for quantum computing commercialization and shows no evidence of an approaching inflection point toward meaningful commercial adoption.
Example: After nearly two decades of operations, D-Wave's annual revenue remains in the range of $5-10 million, primarily from government-funded research contracts rather than commercial quantum computing customers.
D-Wave Sale to Lockheed 2011
Lockheed Martin's purchase of a D-Wave One quantum annealer for approximately $10 million, marking the first sale of a quantum computing system to an end-user organization.
This sale was a marketing triumph but set a troubling precedent: the value to Lockheed was primarily reputational and exploratory, not computational. No documented case of the D-Wave system outperforming classical alternatives on Lockheed's actual problems has been published.
Decoherence Problem
The process by which quantum information is lost as qubits interact with their environment, causing superposition and entanglement to decay. Decoherence times for current hardware range from microseconds to milliseconds — far too short for useful computation.
Decoherence is arguably the single most important barrier to practical quantum computing. Every quantum operation must complete before decoherence destroys the quantum state, imposing severe constraints on circuit depth and algorithm complexity.
Example: A superconducting qubit with a 100-microsecond coherence time can perform roughly 1,000 gate operations before its quantum information is irretrievably lost.
See also: Qubits Are Extremely Fragile, Coherence Must Improve 100x
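The gate budget in the example follows from simple arithmetic: assuming a typical superconducting gate time of roughly 100 nanoseconds, \(100\,\mu\text{s} \div 100\,\text{ns} = 1{,}000\) operations fit inside the coherence window before the quantum state is effectively randomized.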
Dot-Com Bubble Parallel
The comparison between the late-1990s internet investment bubble and current quantum computing investment, highlighting similar patterns of overvaluation, unrealistic projections, and narrative-driven investing.
The dot-com analogy is imperfect — the internet actually worked and had real users — but the investment dynamics are similar: genuine technological potential was used to justify wildly unrealistic valuations and timelines. The key difference is that quantum computing has not yet demonstrated its core technology works at commercial scale.
Example: In 1999, internet companies with no revenue were valued at billions based on "future potential." Similarly, quantum computing companies with no commercial quantum revenue are valued at billions based on projections assuming technical barriers will be overcome.
Dunning-Kruger in QC
The cognitive bias where people with limited knowledge of a subject overestimate their understanding. In quantum computing, non-physicist investors and policymakers often overestimate their ability to evaluate quantum technology claims.
Quantum computing is technically complex enough that superficial understanding can create dangerous overconfidence. Investors who understand the words "superposition" and "entanglement" may believe they understand the technology well enough to assess its viability.
Example: An investor who reads that qubits can "be 0 and 1 simultaneously" may feel they understand quantum computing well enough to invest $10 million, not realizing they lack the physics background to evaluate decoherence, error correction, or scaling constraints.
Dyakonov's Physics Arguments
University of Montpellier physicist Michel Dyakonov's arguments that the continuous-variable nature of quantum states makes quantum computing fundamentally impractical: controlling \(2^{1000}\) continuous parameters with the required precision is physically impossible.
Dyakonov's critique highlights that quantum computing requires manipulating an exponentially large state space with exquisite precision — a requirement that may be fundamentally at odds with the physics of real-world systems.
See also: Controlling 2^1000 Parameters
Each Breakthrough Is Uncertain
The acknowledgment that none of the required breakthroughs for commercial quantum computing is guaranteed by known physics, and several face challenges that may prove to be fundamental physical limitations rather than mere engineering obstacles.
This uncertainty applies not just to timelines but to feasibility. The honest answer for several required breakthroughs is "we do not know if this is physically possible at the required scale."
Each Platform Has Fatal Flaws
The observation that every current quantum computing hardware approach faces at least one challenge that may be insurmountable: superconducting (scaling and coherence), trapped ion (speed), photonic (gate determinism), topological (existence), neutral atom (fidelity).
This pattern suggests that the barriers to quantum computing may be more fundamental than mere engineering challenges. If any platform had a clear, viable path to fault tolerance, resources would concentrate there rather than being spread across five competing approaches.
Electricity as GPT
The electrification of society as a GPT that transformed virtually every industry, improved continuously in generation and distribution efficiency, and enabled countless innovations from lighting to telecommunications to computing.
Like the steam engine, electricity demonstrated GPT characteristics early: it worked reliably, costs declined steadily, and applications proliferated rapidly. Quantum computing shows none of these early-stage GPT indicators.
Energy Consumption Problem
The total energy required to operate a quantum computing system, including cryogenic cooling, classical control electronics, error correction processing, and facility infrastructure. Current small-scale systems already consume tens of kilowatts.
When scaled to commercially relevant sizes, quantum computing energy consumption could rival or exceed that of classical alternatives, undermining the claim that quantum computers will be more efficient for certain problems.
Example: A current quantum computing system with 100 qubits may consume 25 kW of power — roughly the same as 25 household space heaters running continuously — while performing far less useful computation than a standard laptop.
Engineering vs Physics Barrier
The critical distinction between problems solvable through incremental engineering improvement and problems that face fundamental physical constraints. Many quantum computing barriers are physics barriers disguised as engineering challenges.
This distinction matters enormously for investment analysis. Engineering barriers yield to money and effort on predictable timelines; physics barriers may never be overcome regardless of funding. Conflating the two leads to systematically overoptimistic forecasts.
Example: Shrinking classical transistors from 14nm to 7nm was an engineering barrier — difficult but achievable with known physics. Maintaining quantum coherence across millions of qubits may be a physics barrier — no amount of engineering may suffice.
See also: Science vs Engineering Gap
Entanglement Explained
A quantum correlation between two or more qubits such that the quantum state of each qubit cannot be described independently. Measuring one entangled qubit instantly determines the state of its partner, regardless of distance.
Entanglement is the second pillar of quantum computing's theoretical power, enabling coordinated operations across qubits. However, maintaining entanglement across large numbers of qubits is extraordinarily difficult and degrades rapidly with system size — a key engineering barrier.
Example: Two entangled qubits can be perfectly correlated so that measuring one as 0 guarantees the other is 1, but any interaction with the environment can destroy this correlation.
Error Correction Overhead
The massive additional resources required for quantum error correction, where many physical qubits are used to encode and protect a single error-corrected logical qubit. Current leading approaches require roughly 1,000 to 10,000 physical qubits per logical qubit.
Error correction overhead creates a devastating circular problem: you need more qubits to correct errors, but adding more qubits introduces more sources of error. This overhead fundamentally undermines the economic case for quantum computing by multiplying hardware costs by three to four orders of magnitude.
Example: To run Shor's algorithm on a 2,048-bit RSA key, estimates suggest roughly 4,000 logical qubits are needed — which at 1,000:1 overhead means 4 million physical qubits, far beyond any existing or near-term hardware.
See also: 1000 Physical per 1 Logical, Million Qubit Requirement
Error Rate Problem
Current quantum gate operations have error rates of approximately \(10^{-3}\) (1 in 1,000), while useful fault-tolerant computation requires error rates below approximately \(10^{-6}\) or better. This three-order-of-magnitude gap has narrowed only slowly over decades.
The error rate problem is distinct from decoherence — even during coherence time, individual gate operations introduce errors. These errors compound rapidly in multi-step computations, making long quantum calculations unreliable without error correction.
Example: A 100-gate quantum circuit with 1% gate error rates has only about a 37% chance of producing the correct answer, making practical multi-step quantum algorithms impossible without error correction.
See also: Error Correction Overhead, Error Rates Must Drop 100x
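The example's 37% figure follows from compounding independent gate errors: a circuit succeeds only if every gate does, so \(P_{\text{success}} = (1-p)^{n} = 0.99^{100} \approx 0.37\), and success probability decays exponentially with circuit depth.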
Error Rates Must Drop 100x
The requirement that quantum gate error rates must decrease by approximately two orders of magnitude — from current levels of roughly \(10^{-3}\) to approximately \(10^{-5}\) or better — to enable practical error correction.
Error rate improvement has been painfully slow. Achieving a 100x improvement would require fundamental advances in qubit fabrication, control electronics, and environmental isolation that have no guaranteed timeline.
Ethical Investment Duty
The obligation of financial professionals to provide honest, evidence-based assessments of quantum computing investments to their clients, rather than participating in hype-driven narratives that may not serve client interests.
Investment advisors who recommend quantum computing investments without disclosing the technology's TRL 2-3 status, zero commercial returns, and high probability of failure may be failing their fiduciary duties.
See also: Fiduciary Responsibility
Ethics of QC Education
The ethical responsibilities of educators when teaching quantum computing, including the obligation to present realistic assessments of the field's prospects rather than uncritically promoting career paths in a potentially declining field.
Educators face a conflict between institutional incentives to attract students (which favor optimistic framing) and their duty to provide honest career guidance (which requires acknowledging significant uncertainty about the field's future).
Evaluating a Pitch Deck
The skill of critically analyzing quantum computing startup pitch decks, identifying common patterns of exaggeration, missing information, unrealistic assumptions, and red flags that indicate the opportunity may not warrant investment.
Pitch deck evaluation requires combining technical knowledge (is this claim physically plausible?), financial analysis (do these projections have any basis?), and bias awareness (what cognitive traps is this presentation designed to trigger?).
Example: A pitch deck claiming a "$100 billion addressable market for quantum optimization" should trigger scrutiny: What specific optimization problems? Who are the current customers? What do classical alternatives cost? What is the quantum speedup, quantitatively?
Exit Strategy Problem
The difficulty facing quantum computing investors in achieving a profitable exit (sale or IPO) given that the technology has not demonstrated commercial viability and the addressable market remains speculative.
In venture capital, exit strategy is paramount. Quantum computing investments face a narrowing set of exit options: acquisition by a large tech company (at what premium?), IPO (with what revenue story?), or write-off.
Expected Value Framework
A decision-making approach that multiplies the probability of each possible outcome by its value and sums them. For quantum computing investments: \(E[V] = P(\text{success}) \times \text{payoff} - \text{cost}\).
Expected value calculations consistently show that quantum computing investments are poor bets when realistic probabilities are used. The key variable is the probability of achieving fault-tolerant quantum computing within the investment horizon.
Example: If quantum computing has a 10% chance of generating $10 billion in total market value and requires $50 billion in investment, the expected value is \((0.10 \times \$10B) - \$50B = -\$49B\).
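A minimal sketch of the calculation using the example's numbers (all figures in billions of dollars):

```python
# Expected value of the glossary's worked example, in billions of dollars.
def expected_value(p_success: float, payoff: float, cost: float) -> float:
    """E[V] = P(success) * payoff - cost."""
    return p_success * payoff - cost

ev = expected_value(p_success=0.10, payoff=10.0, cost=50.0)
print(f"E[V] = {ev:+.0f} ($B)")  # -49: deeply negative at these inputs
```

The sign of the result is driven almost entirely by the probability estimate, which is why honest priors matter more than precise payoff projections.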
Extraordinary Claims Rule
The principle, attributed to Carl Sagan (echoing David Hume), that extraordinary claims require extraordinary evidence. Claims of revolutionary commercial impact from a TRL 2-3 technology with zero commercial returns constitute extraordinary claims.
This principle provides a simple heuristic for calibrating the evidence threshold. The more extraordinary the claim about quantum computing's future, the stronger the evidence required to support it.
Example: The claim "quantum computing will create $450 billion in value by 2035" is extraordinary and should require evidence proportional to its magnitude — evidence that does not currently exist.
Failed Physics Bets
Historical cases where significant investment in physics-based technologies failed to produce commercial returns, including cold fusion, superconducting electronics, and various nanotechnology applications.
Failed physics bets provide essential base-rate data for evaluating quantum computing. The pattern of initial excitement, sustained investment, and eventual failure is disturbingly familiar.
Feynman's 1981 Idea
Richard Feynman's seminal 1981 lecture proposing that quantum mechanical systems could only be efficiently simulated by quantum computers, often cited as the birth of quantum computing as a concept.
Feynman's idea was brilliant theoretical physics, but it has been transformed by proponents into a narrative of inevitable technological progress. The leap from "quantum systems are hard to simulate classically" to "therefore we should build quantum computers" involves assumptions that remain unvalidated.
Fiber Optics Payoff
The development of fiber optic telecommunications from laboratory demonstration (1970s) to global deployment (1980s-1990s), transforming internet infrastructure and generating trillions in economic value.
Fiber optics succeeded because each generation of the technology provided clear, measurable improvements in bandwidth and cost per bit, creating strong market demand at every stage of development.
Fiduciary Responsibility
The legal and ethical obligation of investment managers, pension fund directors, and financial advisors to act in their clients' best interests, which requires honest risk assessment rather than trend-following in quantum computing investments.
Fiduciary responsibility creates a legal framework for evaluating quantum computing investment decisions. Investing client funds in a TRL 2-3 technology with zero commercial returns requires thorough documentation of the risk-reward analysis.
Example: A pension fund manager who allocates 5% of assets to quantum computing stocks based on McKinsey projections without conducting independent technical due diligence may be breaching fiduciary duty.
First Lab Demos 1998-2001
Early laboratory demonstrations of quantum computing principles using nuclear magnetic resonance (NMR) and a few trapped ions, performing operations on 2-7 qubits.
These demonstrations proved that quantum operations were physically possible, but at scales trillions of times smaller than practical use would require. They established a pattern in which small-scale proofs of concept were extrapolated into sweeping predictions about near-term commercial viability.
Flying Cars Parallel
The comparison between quantum computing and flying cars as frequently predicted technologies that face fundamental economic and practical barriers beyond mere engineering, including cost, safety, infrastructure, and questionable demand.
The flying car analogy highlights that being technically possible does not make something economically viable. Flying cars exist as prototypes, but the economics of operating them make them impractical for mass adoption — a dynamic that may apply equally to quantum computing.
FOMO Drives Bad QC Decisions
The pattern where Fear Of Missing Out overrides rational investment analysis, causing organizations to invest in quantum computing not because the expected value is positive but because they fear competitive disadvantage if they do not.
FOMO is an emotional driver masquerading as strategic thinking. It can be distinguished from genuine strategic investment by asking: "If we knew for certain that quantum computing would not be commercially viable for 20 years, would we still make this investment now?"
FOMO in QC Investment
Fear Of Missing Out — the anxiety that competitors will gain an advantage by investing in quantum computing, driving organizations to invest defensively rather than based on rigorous cost-benefit analysis.
FOMO is a primary driver of corporate and government quantum computing investment. The fear of being "left behind" overrides rational assessment of whether the technology will deliver returns within the investment horizon.
Example: A Fortune 500 CTO authorizing a $20 million quantum computing pilot program "so we're not behind our competitors" is making a FOMO-driven decision, not an expected-value-driven one.
See also: FOMO Drives Bad QC Decisions
Fusion Energy Parallel
The comparison between quantum computing and controlled nuclear fusion as technologies that have been "30 years away" for decades, consuming enormous investment with no commercial returns.
The fusion parallel is instructive because fusion faces similar dynamics: genuine scientific progress, enormous government funding, strong institutional incentives to continue, and a perpetually receding practical timeline. Both fields exhibit the "3-5 years away" pattern.
Example: In 1976, the U.S. government projected commercial fusion by 2005. In 2005, it was projected for 2035. In 2025, ITER targets first plasma by 2035 with commercial reactors "decades" later.
Fusion Hype Comparison
A comparative analysis of quantum computing and fusion energy hype patterns, revealing striking similarities: perpetually receding timelines, massive investment despite no commercial returns, strong institutional incentives to continue, and genuine but insufficient scientific progress.
The fusion comparison is the most powerful historical parallel because the dynamics are nearly identical and have played out over a similar multi-decade timescale.
Example: Both quantum computing and fusion energy have consumed over $100 billion in cumulative investment, promise revolutionary transformation, and have been "10-20 years away" from commercialization for over 30 years.
Gartner Hype Cycle
A graphical model depicting the typical trajectory of technology expectations through five phases: Innovation Trigger, Peak of Inflated Expectations, Trough of Disillusionment, Slope of Enlightenment, and Plateau of Productivity.
Gartner has consistently placed quantum computing near the Peak of Inflated Expectations, with an estimated 10+ years to the Plateau of Productivity. This assessment from a mainstream technology advisory firm aligns with the skeptical view presented in this course.
See also: Peak of Inflated Expectations, Trough of Disillusionment
General Purpose Technology
A technology that is pervasive (used across many sectors), improves continuously over time, and spawns complementary innovations — characteristics shared by historical transformations like steam, electricity, transistors, and the internet.
The GPT framework provides a rigorous test for quantum computing's claimed transformative potential. If quantum computing cannot satisfy the core GPT criteria, it cannot deliver the economy-wide transformation that justifies its investment levels.
See also: GPT Characteristics, QC Fails Every GPT Test
Geopolitical Arms Race Loop
The reinforcing loop where one nation's quantum computing investment triggers competitive investment from rivals, which triggers further investment by the first nation. This dynamic is largely independent of technical progress.
The geopolitical loop may be the most powerful driver of quantum computing funding because national security arguments override cost-benefit analysis. "China is investing $15 billion" is politically more compelling than "the technology doesn't work yet."
Example: The U.S. National Quantum Initiative (2018) was partly motivated by Chinese quantum investments, which were partly motivated by U.S. investments — each side escalating in response to the other regardless of technical milestones.
Gil Kalai's Math Arguments
Hebrew University mathematician Gil Kalai's theoretical arguments that quantum error correction cannot work in principle due to correlated noise and the computational complexity of implementing it, suggesting fault-tolerant quantum computing is mathematically impossible.
Kalai's arguments are significant because they challenge quantum computing at the mathematical level, not merely the engineering level. If correct, they mean no amount of funding or effort can make fault-tolerant quantum computing work.
Goldreich's CS Perspective
Theoretical computer scientist Oded Goldreich's skepticism about quantum computing from a computational complexity perspective, questioning whether quantum speedups will survive the refinement of classical algorithms.
Goldreich's perspective highlights that quantum advantage claims are measured against current classical algorithms, which continue to improve. Many claimed advantages have been reduced or eliminated by classical algorithmic improvements.
Google Buys D-Wave 2013
Google's purchase, jointly with NASA, of a D-Wave Two quantum annealer and the establishment of the Quantum Artificial Intelligence Lab at NASA Ames, lending significant credibility and visibility to quantum computing.
Google's involvement attracted enormous media attention and investment interest, but their own researchers later published mixed results about whether D-Wave's systems demonstrated true quantum speedup.
Google Supremacy Overhype
The gap between Google's 2019 quantum supremacy announcement and its actual significance: the task had no practical application, the classical baseline was disputed, and the result did not demonstrate progress toward useful quantum computing.
Google's supremacy claim illustrates how a technically narrow achievement can be amplified into a misleading narrative of quantum computing's imminent practicality through strategic publication, media management, and ambiguous language.
Google Sycamore Claim 2019
Google's announcement that its 53-qubit Sycamore processor performed a specific calculation in 200 seconds that would take a classical supercomputer an estimated 10,000 years, claiming "quantum supremacy."
This remains the most high-profile quantum computing milestone and the most contested. IBM immediately challenged the classical estimate, and the benchmark task (random circuit sampling) has no practical application. The event exemplifies how quantum milestones are optimized for publicity rather than utility.
See also: Quantum Supremacy Defined, Contrived Benchmarks
Government Funding Pressure
Political and geopolitical dynamics that drive government investment in quantum computing, including technology competition with China, lobbying by industry beneficiaries, and the difficulty of cutting funding for programs branded as national security priorities.
Government funding creates a self-reinforcing loop: public money sustains researchers and companies, who then lobby for continued funding. Once a program is framed as critical to national security, it becomes politically difficult to reduce regardless of technical progress.
GPS Atomic Clock Payoff
The deployment of atomic clocks in GPS satellites, creating a $300+ billion annual market in location-based services, navigation, and timing — one of the most successful quantum technology investments in history.
GPS demonstrates that quantum technology (atomic clocks) can generate enormous economic value when the application is practical, scalable, and meets genuine market demand — criteria quantum computing has yet to satisfy.
GPT Characteristics
The three defining properties of a General Purpose Technology: (1) broad applicability across many economic sectors, (2) continuous improvement over time with declining costs, and (3) ability to enable complementary innovations and new business models.
These three criteria provide an objective checklist for evaluating whether quantum computing qualifies as a GPT. The evidence strongly suggests it does not meet any of the three criteria.
Groupthink
The phenomenon where the desire for group consensus overrides realistic appraisal of alternatives. In the quantum computing community, questioning the technology's viability can result in social and professional costs that discourage honest assessment.
Groupthink is reinforced by the structure of quantum computing research: funding requires optimistic proposals, publication requires positive results, and conference invitations go to contributors rather than critics.
Example: A quantum computing researcher who publicly questions whether fault-tolerant quantum computing is achievable risks being excluded from grant panels, conference programs, and collaborative projects.
Grover's Search Algorithm
A quantum algorithm that searches an unsorted database of N items in \(O(\sqrt{N})\) steps, compared to \(O(N)\) for classical search. Provides a quadratic (not exponential) speedup.
Grover's algorithm is often cited as evidence of quantum computing's broad utility, but its merely quadratic speedup is far less impressive than Shor's exponential advantage. For most practical database sizes, the speedup does not justify the enormous overhead of quantum hardware.
Example: Searching a database of 1 million items would require about 1,000 quantum steps versus 1 million classical steps — a real speedup, but one that classical parallelism can often match or exceed at far lower cost.
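To make the quadratic scaling concrete, here is a minimal Python sketch (illustrative only: it counts idealized queries and ignores all hardware overhead, error correction, and repeated runs) that reproduces the arithmetic above:

```python
import math

# Idealized query counts for unstructured search: classical O(N) versus
# Grover's O(sqrt(N)). Hardware overhead and repeated runs are ignored.
for n in (10**6, 10**9, 10**12):
    classical = n                 # worst-case classical lookups
    grover = math.isqrt(n)        # roughly sqrt(N) quantum queries
    print(f"N = {n:>16,}: classical ~{classical:,}, "
          f"Grover ~{grover:,} ({classical // grover:,}x fewer)")
```

Even at a trillion items, the query-count advantage is only about a millionfold, which is the kind of gap that classical parallelism and better data structures routinely close at far lower cost.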
Hardware Scaling Wall
The observation that quantum computing hardware has not demonstrated a clear, repeatable path to exponential scaling. Adding qubits introduces new noise sources, crosstalk, and control complexity that degrade performance.
Unlike classical computing, which benefited from Moore's Law for decades, quantum computing faces scaling challenges that get harder, not easier, as systems grow. Each doubling of qubits introduces qualitatively new engineering problems.
Example: Going from 50 to 100 superconducting qubits is not simply a matter of adding more circuits — crosstalk between qubits increases, calibration becomes exponentially more complex, and error rates tend to worsen.
Harvest Now Decrypt Later
A threat model where adversaries collect encrypted data today with the intention of decrypting it in the future when quantum computers become powerful enough, making currently secure communications retroactively vulnerable.
This is the most legitimate quantum security concern because long-lived secrets (state secrets, medical records, certain financial data) could theoretically be stored and later decrypted. However, the practical difficulty of storing petabytes of intercepted data for decades, combined with the ongoing migration to post-quantum cryptography, limits the threat in practice.
Example: Intelligence agencies might store intercepted diplomatic communications encrypted with RSA today, hoping to decrypt them in 20-30 years if a sufficiently powerful quantum computer exists.
How Biases Compound
The phenomenon where multiple cognitive biases reinforce each other, creating a self-sustaining system of distorted judgment that is far more resistant to correction than any single bias alone.
In quantum computing, sunk cost, FOMO, authority bias, confirmation bias, and groupthink form a mutually reinforcing system. An investor subject to all five biases simultaneously has virtually no chance of reaching an objective assessment without deliberate debiasing effort.
Example: An investor who has already committed capital (sunk cost), sees competitors investing (FOMO), reads endorsements from Nobel laureates (authority bias), selectively reads positive results (confirmation bias), and attends only quantum computing conferences (groupthink) faces compounding biases that almost guarantee overoptimistic conclusions.
How to Read a Press Release
A guide to critically evaluating corporate and institutional press releases about quantum computing breakthroughs, including identifying marketing language, omitted context, misleading metrics, and the distinction between claims and evidence.
Press releases are marketing documents, not scientific publications. They are designed to attract attention and investment, not to communicate objective scientific findings. Reading them critically requires specific skills.
Example: A press release stating "we achieved a 99.5% gate fidelity" should prompt questions: On which gate? One-qubit or two-qubit? On which qubit pair? What was the measurement methodology? How does this compare to the threshold needed for error correction?
Hype Detection Checklist
A structured set of questions for evaluating whether a technology claim is substantiated or hype-driven, including: Is there a paying customer? What is the classical baseline? Who benefits from this claim? What is the track record of the forecaster?
The hype detection checklist operationalizes the course's analytical framework into a practical tool that can be applied immediately to press releases, pitch decks, and technology forecasts.
Example: Applying the checklist to a claim that "quantum computing will revolutionize drug discovery by 2030" would reveal: no paying pharmaceutical customers, classical methods outperform current quantum approaches, and the forecaster is a quantum computing company seeking investment.
See also: Red Flags in Tech Claims, How to Read a Press Release
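As a purely illustrative sketch, the checklist's questions can be turned into a small scoring script; the yes/no phrasing and the tally rule below are hypothetical choices, not an established instrument:

```python
# Hypothetical operationalization of the hype detection checklist.
# The questions restate the entry above in yes/no form; the tally is
# an illustrative scoring rule, not a standard methodology.
CHECKLIST = (
    "Is there a paying customer?",
    "Is a credible classical baseline stated?",
    "Is the claimant free of financial stakes in the claim?",
    "Does the forecaster have an accurate track record?",
)

def red_flags(answers: dict) -> str:
    """Count 'no' answers; each 'no' is one red flag."""
    flags = sum(1 for q in CHECKLIST if not answers.get(q, False))
    return f"{flags}/{len(CHECKLIST)} red flags"

# The drug-discovery-by-2030 claim from the example above fails all four.
print(red_flags({q: False for q in CHECKLIST}))  # -> 4/4 red flags
```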
Hype Reinforcement Loop
The specific reinforcing feedback loop where media coverage generates investor interest, which funds companies, which fund PR and lobbying, which generates more media coverage — independent of any technical progress.
This loop is the engine of quantum computing investment. It can sustain investment for extended periods without any commercial returns, as long as each actor in the chain maintains their role.
Example: A quantum computing company raises $100 million, spends $5 million on PR, generates favorable media coverage, which attracts more investors, enabling the next funding round — all without producing any commercial revenue.
See also: Self-Sustaining Hype Cycle
IBM 1121-Qubit Chip 2023
IBM's release of the Condor processor with 1,121 superconducting qubits, the first quantum processor to exceed 1,000 qubits.
Despite the milestone, IBM pivoted its roadmap shortly after, emphasizing error correction and smaller but higher-quality qubit systems (Heron processors) over raw qubit count. This implicit admission that more noisy qubits do not equal more capability is telling.
Example: IBM's 2023 roadmap revision de-emphasized qubit scaling in favor of "utility-scale" computation with fewer, better qubits — essentially acknowledging that the previous qubit-count-driven strategy was misleading.
IBM 127-Qubit Chip 2021
IBM's release of the Eagle processor with 127 superconducting qubits, positioned as a major step in IBM's quantum computing roadmap toward thousands of qubits.
The qubit count made headlines, but raw qubit numbers without corresponding improvements in error rates and connectivity do not translate to increased computational power. This is a common pattern where qubit counts serve as marketing metrics rather than capability metrics.
IBM Cloud Access 2016
IBM's launch of the IBM Quantum Experience, providing public cloud access to small quantum processors and establishing quantum computing as a service (QCaaS) model.
Cloud access democratized quantum computing experimentation but also created a misleading impression of commercial readiness. Running toy problems on 5-qubit cloud systems is fundamentally different from solving useful problems at scale.
IBM Roadmap Revisions
IBM's pattern of publishing ambitious quantum computing roadmaps and subsequently revising them, often quietly, as milestones prove unachievable on the original timelines.
IBM's roadmap revisions are particularly instructive because IBM is one of the most credible organizations in quantum computing. If even IBM cannot meet its own published timelines, this suggests the barriers are more formidable than industry projections acknowledge.
Example: IBM's original 2020 roadmap predicted a 100,000-qubit system by 2033. Subsequent revisions have repeatedly adjusted both qubit targets and the metrics used to measure progress.
Independent vs Coupled Risks
The question of whether the required breakthroughs for quantum computing are statistically independent (making joint probability calculation valid) or coupled (where achieving one makes others easier or harder).
If breakthroughs are positively correlated (each one makes others more likely), joint probabilities are higher than the independence calculation suggests. If negatively correlated (solving one makes others harder, as in the error correction overhead paradox), probabilities are lower.
Information Asymmetry
The imbalance in technical knowledge between quantum computing insiders (researchers, company executives) and outsiders (investors, policymakers, media), which creates opportunities for the informed to exploit or mislead the uninformed.
Information asymmetry is particularly acute in quantum computing because the underlying physics is genuinely difficult to understand. This gives insiders significant power to shape narratives and frame results in ways that non-experts cannot easily evaluate.
Example: A quantum computing company's press release about "achieving quantum advantage" may be technically accurate in a narrow, redefined sense that non-expert investors would not distinguish from genuine commercial superiority.
Informed Consent for Careers
The principle that students and early-career professionals should receive honest, comprehensive information about the risks and uncertainties of quantum computing careers before committing years of their professional development.
Borrowing from medical ethics, informed consent requires disclosing material risks. A quantum computing career carries material risk of field contraction that should be disclosed alongside the potential rewards.
Infrastructure Cost
The total investment required to house, power, cool, shield, and maintain a quantum computing system, including the building, electrical supply, vibration isolation, electromagnetic shielding, and specialized technical staff.
Infrastructure costs are frequently excluded from quantum computing cost projections, creating misleadingly optimistic economic analyses. A realistic total cost of ownership must include all supporting systems, not just the quantum processor itself.
Example: IBM's quantum computing center in Yorktown Heights represents hundreds of millions of dollars in specialized infrastructure supporting a relatively small number of operational qubits.
Internet as GPT
The internet as a GPT demonstrating all three characteristics: broad applicability (commerce, communication, entertainment, education), continuous improvement (bandwidth, protocols, services), and enabling of unprecedented complementary innovations.
The internet had millions of users and clear commercial demand within a decade of commercialization. Quantum computing, after a comparable period of commercial availability (since ~2016 via cloud), has essentially no commercial users.
IonQ IPO and Stock Decline
IonQ's 2021 public listing via SPAC merger at a valuation of approximately $2 billion, followed by significant stock price decline as revenue failed to meet projections and the company reported substantial ongoing losses.
IonQ's trajectory illustrates the risks of investing in quantum computing companies based on technology promise rather than financial fundamentals. The company's market capitalization has fluctuated dramatically based on sentiment rather than revenue growth.
Example: IonQ went public at roughly $10 per share, peaked above $30, and subsequently experienced significant volatility, all while generating less than $25 million in annual revenue against hundreds of millions in cumulative losses.
Joint Probability Problem
The mathematical reality that if each of 10 required breakthroughs has an independent probability \(p\) of occurring, the probability of all 10 occurring is \(p^{10}\). Even with a generous 80% probability per breakthrough, the joint probability is only about 10%.
The joint probability calculation is one of the most powerful analytical tools in this course for converting qualitative skepticism into quantitative risk assessment.
Example: If each of 10 required breakthroughs has a 70% chance of success, the probability of all succeeding is \(0.7^{10} \approx 2.8\%\). Even at 80% per breakthrough, the joint probability is only \(0.8^{10} \approx 10.7\%\).
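The figures above are easy to verify; a few lines of Python reproduce them and show how sharply the joint probability falls:

```python
# Joint probability of ten independent breakthroughs, each succeeding
# with probability p: the product p**10.
for p in (0.7, 0.8, 0.9):
    print(f"p = {p:.0%} per breakthrough -> joint {p**10:.1%}")
# 70% -> 2.8%, 80% -> 10.7%, 90% -> 34.9%
```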
Laser Investment Payoff
The laser's development from theory (1958) to commercial applications (manufacturing, telecommunications, medicine) within a decade, illustrating how viable physics-based technologies find commercial uses quickly.
The laser is often called "a solution looking for a problem," but it found those problems rapidly. Quantum computing has been looking for commercially viable problems for decades without success.
Laughlin on Coherence Limits
Nobel laureate Robert Laughlin's arguments that quantum coherence in many-body systems faces fundamental physical limits related to thermodynamics and the emergence of classical behavior at macroscopic scales.
Laughlin's critique, grounded in condensed matter physics, suggests that maintaining coherence across large quantum systems may violate deep principles of statistical mechanics.
Leverage Points
Places in a system where small changes can produce large effects, as identified by Donella Meadows. In the quantum computing investment system, potential leverage points include transparency requirements, independent technical review, and changing incentive structures.
Understanding leverage points helps identify practical interventions that could improve quantum computing investment decision-making without requiring the entire system to change simultaneously.
Example: Requiring quantum computing companies to publish audited benchmarks comparing their systems to classical alternatives on the same problems would be a leverage point that introduces a balancing feedback loop.
Levin's Complexity Doubts
Computer scientist Leonid Levin's skepticism that quantum computing can deliver on its promises, based on computational complexity theory arguments about the relationship between quantum and classical computation.
Levin's perspective as a co-discoverer of NP-completeness lends significant weight to his skepticism about quantum computing's computational claims.
Making Better Tech Bets
The capstone skill of using the analytical frameworks from this course — expected value, base rates, bias recognition, systems thinking, and skeptical inquiry — to make more rational technology investment and policy decisions across all domains.
Making better tech bets is the ultimate purpose of this course. Quantum computing serves as a case study for developing transferable judgment about technology claims, investment risks, and the cognitive traps that lead to poor decisions.
Example: A student who completes this course should be able to evaluate any technology investment — quantum computing, AI, fusion, biotech — by asking: What is the base rate of success? What is the expected value? What biases might be affecting my judgment? What do independent critics say?
Market Valuation vs Revenue
The extreme disconnect between quantum computing companies' stock market valuations and their actual revenue, with price-to-revenue ratios often exceeding 100x — levels not justified by growth rates or market opportunity.
This valuation disconnect is a classic indicator of speculative excess. When company valuations are based entirely on hoped-for future technology breakthroughs rather than current financial performance, investors face enormous downside risk.
Example: A quantum computing company valued at $2 billion with $20 million in annual revenue trades at 100x revenue — a multiple that implies extraordinary future growth that has no precedent in the quantum computing industry.
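The multiple itself is simple division, as this one-liner with the entry's illustrative figures shows:

```python
# Price-to-revenue multiple from the entry's illustrative figures.
valuation = 2_000_000_000    # $2B market capitalization
annual_revenue = 20_000_000  # $20M annual revenue
print(f"{valuation / annual_revenue:.0f}x revenue")  # 100x
```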
Massive Qubits Needed
The estimated 10-20 million physical qubits required to break standard RSA-2048 encryption using Shor's algorithm with current error correction approaches, a figure roughly 10,000-20,000 times larger than the largest existing quantum processors.
This number should be the centerpiece of any honest discussion about the quantum cryptographic threat. The gap between current hardware (approximately 1,000 noisy qubits) and what is needed (millions of error-corrected qubits) is staggering.
McKinsey $450B Projection
McKinsey's widely cited projection that quantum computing could generate $450-850 billion in economic value by 2035, a figure built on assumptions about technical progress that have no historical basis.
This projection exemplifies how authoritative-sounding numbers can enter the investment discourse without adequate scrutiny. The projection assumes fault-tolerant quantum computers exist and are commercially deployed — the very outcome that is in question.
Example: Breaking down McKinsey's $450B figure reveals it assumes quantum computers solve problems in pharmaceuticals, chemicals, finance, and logistics by 2035 — a timeline requiring technical progress far beyond historical rates.
Media Amplification Effect
The process by which modest, incremental, or even disputed scientific results are transformed into dramatic headlines about quantum computing breakthroughs through press releases, science journalism, and social media.
Media amplification creates a distorted public perception where quantum computing appears to be advancing rapidly toward practical use. Understanding this amplification chain is essential for accurately assessing the state of the technology.
Example: A paper reporting a 0.1% improvement in qubit coherence time might become "Quantum Computing Breakthrough Could Revolutionize Drug Discovery" in mainstream media coverage.
Million Qubit Requirement
The estimated scale of physical qubits needed for commercially relevant quantum computations, typically estimated at 1 million to 100 million physical qubits depending on the application and error correction scheme.
No clear engineering pathway exists to build and control a million-qubit system. Current state-of-the-art processors have roughly 1,000 qubits, meaning a 1,000x scale-up is needed — each step introducing new engineering challenges in wiring, cooling, and control.
Example: Estimates for breaking RSA-2048 encryption require on the order of 20 million physical qubits operating with error rates not yet achieved, a scale roughly 20,000 times larger than today's largest processors.
Missing Balancing Loop
The absence or weakness of feedback mechanisms that should correct overinvestment: market discipline (no customers to provide revenue feedback), scientific accountability (publication bias favors positive results), and media scrutiny (science journalism amplifies hype).
Identifying missing balancing loops is a key systems thinking insight. In a healthy market, lack of revenue would reduce investment. In quantum computing, government funding, corporate strategic budgets, and VC FOMO substitute for absent market feedback.
Motivated Reasoning
The tendency to arrive at conclusions aligned with one's desires or interests, using reasoning as a tool for justification rather than truth-seeking. Quantum computing stakeholders have strong financial and career motivations to reach optimistic conclusions.
Motivated reasoning is distinct from deliberate deception — people genuinely believe their optimistic assessments because their cognitive processes are shaped by their incentives. This makes it a more pervasive and harder-to-detect problem than outright fraud.
Moving Goalposts Pattern
The practice of redefining what constitutes "progress" or "success" in quantum computing whenever previous goals are not met, ensuring that the field always appears to be advancing regardless of actual outcomes.
This pattern makes quantum computing claims effectively unfalsifiable — a major red flag in scientific evaluation. If the definition of success changes every time a benchmark is missed, no amount of failure can disprove the technology's viability.
Example: When qubit counts failed to deliver practical computation, the industry shifted to emphasizing "quantum volume." When quantum volume plateaued, the focus shifted to "utility-scale" and "error-mitigated" computation.
See also: Redefining Success, Unfalsifiable Timelines
MRI Investment Payoff
The development of magnetic resonance imaging from physics principles (nuclear magnetic resonance) to a multibillion-dollar medical imaging industry, demonstrating successful translation of quantum physics into commercial technology.
MRI used quantum physics principles in a practical, reliable, and commercially viable way. The technology worked at clinical scale within a few decades of discovery and now generates over $7 billion annually in equipment sales alone.
Must Be Broadly Applicable
The first GPT criterion: the technology must be useful across many different industries and applications, not confined to a narrow set of specialized problems.
Quantum computing fails this test. It offers theoretical speedups only for a narrow class of problems with specific mathematical structure, making it more analogous to a specialized scientific instrument than a general-purpose technology.
Example: The transistor is broadly applicable — it is used in every electronic device. Quantum computing's potential applications (certain optimizations, simulations, and cryptographic tasks) represent a tiny fraction of the computing market.
Must Enable New Innovations
The third GPT criterion: the technology must serve as a platform that enables entirely new products, services, and business models that were previously impossible.
Quantum computing has not yet enabled any new innovation that could not have been achieved classically. Until fault-tolerant quantum computers exist, this criterion cannot be evaluated empirically, which is itself a problem for investment based on GPT assumptions.
Must Improve Over Time
The second GPT criterion: the technology must demonstrate continuous improvement in capability and cost-effectiveness, enabling expanding applications over time.
Quantum computing's improvement trajectory has been far slower than GPT benchmarks. Qubit counts have grown, but error rates and coherence times have improved only incrementally, and cost per useful computation has not demonstrably decreased.
Narrative Fallacy
The human tendency to construct coherent stories from incomplete information, creating misleading causal explanations. In quantum computing, the narrative "quantum mechanics is weird and powerful, therefore quantum computers will be transformative" is compelling but logically flawed.
The quantum computing narrative — from Feynman's vision through Shor's algorithm to the "quantum race" — is a compelling story that feels inevitable. But narrative coherence is not evidence of technical feasibility or economic viability.
Example: "Quantum mechanics revolutionized physics, therefore quantum computing will revolutionize computing" sounds logical but is a narrative fallacy — the truth of the first clause does not imply the truth of the second.
Narrow Problem Applicability
The limitation that quantum speedups exist only for a small class of problems with specific mathematical structure, primarily integer factoring, discrete logarithms, unstructured search, and certain quantum simulations.
The vast majority of commercially important computations — databases, web services, AI training, business logic — have no known quantum speedup. This fundamentally limits quantum computing's potential market size and undermines GPT (General Purpose Technology) claims.
Example: Of the thousands of algorithms in use across industry, only a few dozen have known quantum speedups, and most of those speedups are polynomial rather than exponential.
National QC Strategy Critique
The skill of critically evaluating government quantum computing strategies and funding programs, assessing whether they are evidence-based or driven by hype, FOMO, and lobbying.
National strategies are particularly important to evaluate because government funding is often the largest source of quantum computing investment and the least subject to market discipline.
Example: A critique of a national quantum strategy would assess: Are the stated goals technically achievable? Are timelines realistic given historical progress? Are milestones measurable and falsifiable? What is the opportunity cost versus alternative technology investments?
Near Absolute Zero Temps
Operating temperatures of approximately 15 millikelvin (0.015 Kelvin, or -273.135 degrees Celsius) required by superconducting quantum processors. This is colder than outer space and requires sophisticated, expensive cooling infrastructure.
These extreme temperatures are necessary because thermal energy at even slightly higher temperatures would instantly destroy quantum coherence. The energy cost of maintaining these temperatures scales unfavorably as systems grow larger.
Example: Outer space has a background temperature of about 2.7 Kelvin — roughly 180 times warmer than the operating temperature of a superconducting quantum processor.
Need 1000x More Qubits
The requirement that quantum processors must scale from current levels of approximately 1,000 qubits to approximately 1 million or more qubits to support error correction for commercially relevant computations.
A 1,000x scale-up is not simply doing the same thing 1,000 times — it introduces qualitatively new engineering challenges in fabrication, wiring, control, cooling, and calibration that no existing approach has solved.
Net Present Value
The present value of future expected cash flows minus the initial investment, discounted at an appropriate rate. NPV analysis of quantum computing ventures must account for the probability-weighted timeline uncertainty of achieving commercial viability.
NPV analysis is devastating for quantum computing investments when realistic probabilities of success and timelines are used. A technology that might produce revenue in 15-20 years with a 10-20% probability of success has a strongly negative NPV at any reasonable discount rate.
Example: A $100 million quantum computing investment with a 15% probability of generating $1 billion in revenue in 15 years, discounted at 10%, has an NPV of approximately -$64 million.
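The -$64 million figure can be reproduced with the simplest possible probability-weighted NPV model; this sketch assumes a single terminal payoff rather than a full cash flow schedule:

```python
# Probability-weighted NPV of the entry's illustrative investment:
# $100M outlay today, 15% chance of a $1B payoff in year 15, 10% rate.
investment = 100e6
payoff = 1e9
p_success = 0.15
rate = 0.10
years = 15

npv = p_success * payoff / (1 + rate) ** years - investment
print(f"NPV: ${npv / 1e6:.1f} million")  # about -$64.1 million
```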
Neutral Atom Approach
A quantum computing platform using arrays of individual neutral atoms trapped by focused laser beams (optical tweezers), with quantum gates performed via controlled excitation to Rydberg states.
Neutral atom systems have recently shown promise for scaling to hundreds of qubits in 2D and 3D arrays, with natural all-to-all connectivity through atom rearrangement. However, gate fidelities and speeds still lag behind trapped ion and superconducting approaches.
Example: QuEra's systems have demonstrated arrays of over 200 neutral rubidium atoms, but two-qubit gate fidelities remain below the threshold needed for effective error correction.
New Algorithms Needed
The requirement for discovery of new quantum algorithms that provide significant speedups on commercially important problems beyond the narrow set currently known (factoring, unstructured search, certain simulations).
Without new algorithms, quantum computing's commercial applicability is limited to a tiny fraction of the computing market, capping the technology's maximum economic value regardless of hardware progress.
NIST Already Has Standards
The U.S. National Institute of Standards and Technology finalized its first set of post-quantum cryptographic standards (FIPS 203, 204, 205) in August 2024, providing government-endorsed quantum-resistant algorithms.
NIST's standardization means the migration to quantum-safe cryptography can begin immediately, long before any quantum computer capable of threatening current encryption exists. This proactive defense further diminishes the urgency of the quantum cryptographic threat.
No Paying Customers
The observation that no organization is paying for quantum computing services because those services outperform classical alternatives on problems the organization actually needs solved.
This is the market's verdict on quantum computing's current utility. Despite cloud access being available from multiple providers, customers use it for education and experimentation, not production workloads.
No Platform Has Solved It
The observation that despite decades of research across multiple hardware approaches (superconducting, trapped ion, photonic, topological, neutral atom), no platform has achieved fault-tolerant quantum computation at commercial scale.
The fact that no platform has solved these fundamental challenges after billions of dollars and decades of research is itself evidence about the difficulty of the problem. Proponents argue this means we just need more time; skeptics argue it suggests the barriers may be fundamental.
No Real-World Advantage Yet
The empirical observation that no quantum computer has solved any commercially relevant problem faster or cheaper than a classical computer as of 2025, despite over $100 billion in cumulative investment.
This is perhaps the single most important fact in the entire quantum computing investment debate. After 40+ years of research and massive investment, quantum computing has produced zero commercial return. Any honest assessment must grapple with this reality.
Opportunity Cost
The value of the best alternative investment forgone when capital is allocated to quantum computing. Every dollar invested in quantum computing could instead fund classical AI hardware, quantum sensing, or other technologies with proven commercial potential.
Opportunity cost is the most overlooked factor in quantum computing investment decisions. The relevant question is not "Could quantum computing eventually work?" but "Is this the best use of these funds compared to all alternatives?"
Example: The $100+ billion invested in quantum computing could have funded roughly two years of the entire U.S. National Institutes of Health budget, or built thousands of classical computing data centers.
Optimism Bias
The tendency to overestimate the probability of positive outcomes and underestimate the likelihood of negative ones. In quantum computing, proponents systematically overestimate the chances of technical breakthroughs and underestimate remaining barriers.
Optimism bias is amplified in quantum computing by the alignment of personal incentives (career advancement, funding) with optimistic projections. Researchers and companies that project success receive more funding than those who express realistic uncertainty.
Peak of Inflated Expectations
The phase of the Gartner Hype Cycle characterized by excessive enthusiasm, unrealistic projections, and overinvestment, driven by early success stories and media amplification rather than demonstrated commercial value.
Quantum computing has been at or near this peak since roughly 2018. The SPAC boom of 2021-2022, massive VC funding, and McKinsey-style projections are hallmarks of peak inflated expectations.
PhD Career Risk in QC
The risk that students who invest 4-7 years in quantum computing doctoral programs may find limited career options if the field contracts or pivots, having specialized in a narrow area with few commercial applications.
PhD career risk is an underexamined ethical dimension of quantum computing hype. Unlike software engineering or data science, quantum computing PhD skills are not easily transferable if commercial demand fails to materialize.
Example: A student beginning a quantum computing PhD in 2025 would graduate around 2030-2032, potentially entering a job market that has passed through the Trough of Disillusionment with far fewer positions available.
Photonic Approach
A quantum computing platform that uses individual photons (particles of light) as qubits, manipulated through optical circuits including beam splitters, phase shifters, and single-photon detectors.
Photonic systems can operate at room temperature, avoiding cryogenic costs, but they face fundamental challenges: photons do not naturally interact with each other, making two-qubit gates probabilistic and extremely inefficient. Photon loss in optical circuits also introduces significant errors.
Example: Xanadu's Borealis system demonstrated a form of quantum advantage on Gaussian boson sampling in 2022, but this task has no known commercial application.
1000 Physical per 1 Logical
The approximate ratio of physical qubits needed to implement one fault-tolerant logical qubit using current error correction codes like the surface code. Estimates range from 1,000 to 10,000 depending on physical error rates and code distance.
This ratio is the single most damaging number for quantum computing's commercial viability. It means that a "1,000-qubit" quantum computer actually provides only about 1 logical qubit of useful computation — a fact often omitted from marketing materials and press releases.
Example: IBM's 1,121-qubit Condor processor (2023) provides, by this metric, approximately 1 logical qubit — insufficient to run any commercially useful quantum algorithm.
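The logical-qubit arithmetic is worth stating explicitly; this short sketch applies the entry's 1,000-10,000 overhead range to the Condor example:

```python
# Logical qubits implied by a physical-qubit count, using the
# 1,000-10,000 physical-per-logical overhead range from the entry.
physical = 1_121  # IBM Condor (2023)
for overhead in (1_000, 10_000):
    print(f"{physical:,} physical at {overhead:,}:1 -> "
          f"{physical // overhead} logical qubit(s)")
# 1,000:1 -> 1 logical qubit; 10,000:1 -> 0 logical qubits
```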
Platform Comparison
A systematic evaluation of competing quantum hardware approaches across metrics including qubit count, gate fidelity, coherence time, gate speed, connectivity, operating temperature, and scalability potential.
No single platform excels on all metrics, and each faces distinct fundamental challenges. This lack of a clear winning approach after decades of parallel development is itself an indicator of the difficulty of building practical quantum computers.
Portfolio Allocation Analysis
The quantitative process of determining optimal allocation of investment capital across asset classes, including determining appropriate (typically minimal) exposure to quantum computing given its risk-return profile.
Portfolio allocation analysis provides a mathematical framework for the investment decision. When quantum computing is evaluated alongside alternatives using standard portfolio theory (risk, return, correlation), it typically receives minimal or zero allocation.
Portfolio Diversification
The investment strategy of spreading capital across multiple assets to reduce risk. In a technology portfolio, diversifying away from quantum computing toward quantum sensing, AI hardware, and proven technologies reduces exposure to quantum computing's high failure probability.
Portfolio theory provides a quantitative framework for determining appropriate quantum computing allocation. Given the high uncertainty and long timeline, most portfolio optimization models would recommend minimal allocation to quantum computing.
Post-Quantum Crypto Exists
Cryptographic algorithms resistant to quantum attacks already exist and are being standardized. NIST finalized its first post-quantum cryptography standards in 2024, providing quantum-resistant alternatives to RSA and ECC.
The existence of post-quantum cryptography fundamentally undermines the urgency narrative around quantum threats to encryption. The defense is being deployed before the attack capability exists — a rare and favorable situation in cybersecurity.
Example: CRYSTALS-Kyber (ML-KEM) and CRYSTALS-Dilithium (ML-DSA) are NIST-standardized post-quantum algorithms already being integrated into web browsers, TLS protocols, and government systems.
Prediction Track Records
The historical accuracy of specific forecasters' predictions about quantum computing, revealing systematic overoptimism among proponents and more accurate assessments from skeptics.
Evaluating forecasters' track records is one of the most reliable methods for calibrating trust in new predictions. In quantum computing, proponents with financial stakes have consistently worse prediction records than independent skeptics.
Probability of Success
The estimated likelihood that quantum computing will achieve commercial viability within a given timeframe, based on the joint probability of all required technical breakthroughs occurring together.
Honest estimates of quantum computing's probability of commercial success range from 5-20% within 20 years, depending on assumptions. These estimates must account for the joint probability problem — multiple independent breakthroughs must all succeed.
See also: Joint Probability Problem
Professor Grant Conflicts
The conflict of interest faced by professors who depend on quantum computing grants for their lab funding, graduate student support, and career advancement, creating incentives to project optimism about the field's prospects.
This conflict is structural, not personal — the academic funding model inherently creates bias toward optimism in assessments of the fields that fund research. Professors are not dishonest; the system selects for and rewards optimistic framing.
Proponents Have Funding Bias
The observation that quantum computing proponents — including researchers, company executives, consultants, and government program managers — typically have direct financial interests tied to continued investment and optimistic projections.
Funding bias does not mean proponents are dishonest — it means their judgment is systematically shaped by incentives that favor optimism. This is exactly the kind of bias that cognitive science has shown reliably distorts forecasting.
Public Key Encryption Basics
Cryptographic systems like RSA and elliptic curve cryptography that rely on mathematical problems (integer factoring, discrete logarithms) believed to be computationally intractable for classical computers. They secure internet communications, financial transactions, and government secrets.
Understanding the basics of public key cryptography is necessary to evaluate the quantum threat accurately. Many quantum computing investment pitches exploit fear about encryption being "broken" without explaining the enormous technical requirements.
QC Cannot Replace Classical
The fundamental reality that quantum computers cannot perform general-purpose computation and therefore cannot replace classical computers. Quantum computers would at best be specialized accelerators for specific problem types, always requiring classical computers for pre- and post-processing.
This means quantum computing, even if fully successful, would be more like a GPU (specialized accelerator) than a CPU (general-purpose processor) — valuable in specific applications but not a platform for broad economic transformation.
QC Fails Every GPT Test
The assessment that quantum computing does not satisfy any of the three General Purpose Technology criteria: it is not broadly applicable, has not demonstrated continuous improvement in commercial capability, and has not enabled new innovations.
This analysis is one of the strongest arguments against quantum computing as a transformative investment. Technologies that fail all three GPT tests have never historically delivered economy-wide returns.
QC Is Narrowly Applicable
The limitation that quantum computing offers theoretical speedups only for problems with specific mathematical structure — primarily integer factoring, unstructured search, and certain quantum simulations — which represent a tiny fraction of the global computing market.
Narrow applicability fundamentally caps quantum computing's maximum potential market size, even under the most optimistic assumptions about technical progress. This undermines projections of hundreds of billions in economic value.
QC Job Market Reality
The current quantum computing job market, which consists almost entirely of research positions in academia, national labs, and R&D departments — with virtually no positions in commercial quantum computing operations.
The job market provides a market-based signal about quantum computing's commercial readiness. A technology approaching commercial viability would show growing demand for operations, sales, and support roles — not just researchers.
QC Stuck at TRL 2-3
The assessment that most quantum computing technology remains at TRL 2 (technology concept formulated) to TRL 3 (analytical and experimental proof of concept), despite often being marketed as though it were at TRL 5-6.
The gap between actual TRL and marketed TRL represents one of the most significant sources of investor deception in the quantum computing industry. Recognizing this gap is essential for realistic investment assessment.
Example: Running a 100-qubit noisy computation with no error correction is approximately TRL 3. Fault-tolerant quantum computation has not yet reached TRL 3 on any platform.
QC vs Classical Computing
A comparative analysis of quantum and classical computing across dimensions including speed, reliability, cost, energy consumption, problem applicability, and commercial readiness. Classical computing excels on nearly every practical metric today.
This comparison is essential for evaluating investment claims. When proponents claim "quantum advantage," the critical question is always: compared to what classical baseline, on what problem, at what cost?
Example: Google's 2019 Sycamore claim of quantum supremacy was challenged when IBM showed the same calculation could be done classically in 2.5 days rather than the claimed 10,000 years.
Quantum Advantage Defined
A proposed milestone where a quantum computer solves a practical, commercially relevant problem faster, cheaper, or better than any classical alternative. Unlike quantum supremacy, quantum advantage requires real-world utility.
No quantum computer has demonstrated quantum advantage by this definition as of 2025. The distinction between supremacy (artificial benchmarks) and advantage (real problems) is critical for evaluating investment claims.
Quantum Gravimeters
Instruments that use quantum interference of cold atoms to measure gravitational acceleration with extreme precision, used in mineral exploration, civil engineering, and geophysical research.
Quantum gravimeters illustrate the pattern that quantum sensing succeeds commercially because it exploits quantum effects in small, well-controlled systems — the opposite of quantum computing's requirement for large, coherent multi-qubit systems.
Quantum Key Distribution
A quantum communication protocol that uses quantum mechanical properties to securely distribute cryptographic keys, with the theoretical guarantee that any eavesdropping would be detectable. Commercially deployed in limited niche applications.
QKD is sometimes grouped with quantum computing, but it is a distinct technology with different requirements and commercial prospects. It has been commercially deployed, though its niche market and high cost limit its impact relative to post-quantum cryptography.
Quantum Magnetometers
Sensing devices that use quantum effects to measure magnetic fields with extraordinary sensitivity, used in medical imaging (magnetoencephalography), geological surveying, and military applications.
Quantum magnetometers are commercially deployed and revenue-generating, demonstrating that quantum technology applications do not require the massive qubit systems and error correction that make quantum computing impractical.
Quantum Measurement Problem
The fundamental issue that observing a quantum system collapses its superposition into a single definite state, destroying the quantum information. Measurement yields probabilistic outcomes, requiring many repeated runs to extract useful results.
The measurement problem imposes hard limits on what information can be extracted from a quantum computation. You cannot simply "read out" the full quantum state — you get one probabilistic sample per measurement, which constrains algorithm design and increases computational overhead.
Example: Running Grover's algorithm once does not guarantee the correct answer; the algorithm must be run multiple times to build statistical confidence in the result.
Quantum Sensing
The use of quantum mechanical effects (superposition, entanglement, quantum interference) in measurement devices to achieve sensitivities beyond classical limits. Unlike quantum computing, quantum sensing has demonstrated commercial viability.
Quantum sensing is the most important contrast in this course because it shows that quantum technology can work commercially — the problem is specifically with quantum computation, not with quantum physics applications broadly.
Example: Quantum magnetometers, atomic clocks, and quantum gravimeters are already deployed commercially and generating revenue, with market sizes in the billions of dollars.
See also: Atomic Clocks, Quantum Magnetometers, Quantum Gravimeters
Quantum Supremacy Defined
A milestone where a quantum computer performs a specific computation that no classical computer can complete in a reasonable time. Originally proposed by John Preskill in 2012 as a theoretical benchmark, not a measure of practical utility.
Quantum supremacy demonstrations have used highly contrived problems with no practical applications. The term has been criticized as misleading because it implies broad computational superiority when it only demonstrates an advantage on one narrow, artificial task.
Example: Google's 2019 Sycamore experiment claimed supremacy on random circuit sampling — a task specifically designed to be hard for classical computers but that has no known commercial use.
See also: Quantum Advantage Defined, Contrived Benchmarks
Qubits Are Extremely Fragile
Quantum states are destroyed by thermal noise, electromagnetic interference, vibrations, stray magnetic fields, and even cosmic rays. Maintaining quantum coherence requires extraordinary isolation from the environment.
This fragility is not merely an engineering inconvenience — it reflects fundamental physics. Any interaction with the environment constitutes a measurement that collapses the quantum state. This is why quantum computers require extreme cooling and shielding.
Example: A single photon of stray light or a temperature fluctuation of millionths of a degree can destroy the quantum state of a superconducting qubit.
Red Flags in Tech Claims
Specific warning signs that a technology claim may be exaggerated or unfounded: no independent verification, vague timelines, redefined metrics, appeals to authority, suppression of criticism, and extraordinary claims without extraordinary evidence.
Learning to recognize red flags is a practical skill that protects against technology hype in any domain, from quantum computing to AI to biotech.
Example: A press release claiming "quantum advantage achieved" that does not specify the classical baseline, uses a contrived benchmark, and is not published in a peer-reviewed journal raises multiple red flags.
Redefining Success
The practice of retroactively changing the criteria for quantum computing milestones after the original criteria are not met, allowing the narrative of progress to continue uninterrupted.
Redefining success is closely related to moving goalposts but focuses on specific instances where a claimed achievement does not actually meet the originally stated benchmark.
Example: Google's "quantum supremacy" was originally supposed to demonstrate a clear, unambiguous advantage. When IBM disputed the classical baseline, the term was softened to "quantum computational advantage" with looser criteria.
Reinforcing Feedback Loop
A causal loop where the output amplifies the input, creating exponential growth or decline. In the quantum computing ecosystem, multiple reinforcing loops sustain and amplify investment and hype.
Reinforcing loops explain the explosive growth of quantum computing investment despite no commercial returns. Each loop — hype, funding, career incentives, geopolitical competition — amplifies the others.
Return on Investment
The ratio of net profit to total investment cost, expressed as a percentage. In quantum computing, ROI has been universally negative across all companies and government programs through 2025.
ROI is the ultimate measure of whether quantum computing investment is justified. Despite over $100 billion in cumulative global investment, no quantum computing venture has achieved positive ROI from quantum computation services.
Example: IonQ, a publicly traded quantum computing company, had cumulative revenues of approximately $50 million through 2024 against over $600 million in total invested capital — a deeply negative ROI.
See also: Net Present Value, Risk-Adjusted Returns
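Using the entry's IonQ figures, the ROI arithmetic takes three lines; note that substituting cumulative revenue for net profit makes this an upper bound, since the company operates at a loss:

```python
# ROI upper bound from the entry's illustrative IonQ figures. Revenue
# stands in for net profit here, so the true ROI is even worse.
revenue = 50e6     # ~$50M cumulative revenue through 2024
invested = 600e6   # ~$600M total invested capital
print(f"ROI: {(revenue - invested) / invested:.0%}")  # about -92%
```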
Revenue Model Problem
The fundamental challenge that quantum computing lacks a proven business model: it is unclear who will pay for quantum computation, what problems they would solve, and how prices would compare to classical alternatives.
Without a credible revenue model, quantum computing companies rely on government grants, corporate R&D budgets, and investor capital — none of which represent sustainable commercial demand.
Rigetti Financial Struggles
Rigetti Computing's ongoing financial difficulties following its 2022 SPAC-enabled public listing, including low revenue, high operating losses, declining stock price, and concerns about cash runway.
Rigetti's struggles exemplify the financial reality facing pure-play quantum computing companies: the technology cannot yet generate sufficient revenue to sustain operations, leaving companies dependent on continued capital raises.
Risk Assessment Framework
A structured approach to identifying, quantifying, and evaluating the risks associated with quantum computing investment, including technology risk, market risk, execution risk, competitive risk, and timeline risk.
Comprehensive risk assessment for quantum computing must include a category rarely present in technology investing: fundamental physics risk — the possibility that the underlying approach is physically impossible at commercial scale.
Risk-Adjusted QS Returns
The investment returns from quantum sensing technologies after adjusting for risk, which are substantially more favorable than quantum computing returns due to proven commercial demand, lower technical risk, and existing revenue streams.
Risk-adjusted return comparison between quantum sensing and quantum computing starkly illustrates the irrationality of current investment allocation, which favors the speculative (computing) over the proven (sensing).
Risk-Adjusted Returns
Investment returns modified to account for the probability and magnitude of various outcomes, including failure. Risk adjustment is essential for quantum computing investments because the probability of total loss is high.
Standard venture capital risk-adjustment methods may understate quantum computing risk because they were developed for software and biotech ventures where fundamental physics barriers are rare. Quantum computing combines technology risk, market risk, and execution risk at unusually high levels.
Science Journalism Problems
Structural issues in science journalism that amplify quantum computing hype, including reporters' lack of technical expertise, dependence on sources with conflicts of interest, incentives to produce exciting narratives, and pressure to publish quickly.
Science journalism serves as a critical transmission mechanism in the hype cycle, transforming nuanced scientific results into dramatic headlines that shape public and investor perception.
Science vs Engineering Gap
The distinction between demonstrating a scientific principle in a laboratory setting and engineering a commercially viable product based on that principle. Many technologies succeed scientifically but fail commercially.
This gap is enormous in quantum computing. The science of individual qubits is well established, but the engineering of million-qubit fault-tolerant systems is unsolved and may be unsolvable with current approaches.
Example: The science of flight was understood by the 1890s, but engineering commercially viable air travel took 60+ years and required entirely different technologies (jet engines, aluminum airframes) than the early demonstrations used.
Self-Sustaining Hype Cycle
The complete system of reinforcing feedback loops (media hype, VC funding, government funding, career incentives, geopolitical competition) that sustains quantum computing investment independently of technical progress or commercial viability.
This concept synthesizes all the individual loops into a unified understanding of why quantum computing investment continues despite poor results. The system is self-sustaining because no single actor can or wants to stop the cycle.
Sensing Needs Few Qubits
The key architectural difference between quantum sensing (which typically uses one or a few quantum systems) and quantum computing (which requires millions of coherent, entangled qubits), explaining why sensing succeeds commercially while computing does not.
This distinction explains the radically different commercial trajectories. Quantum sensing works because it does not face the scaling, error correction, or multi-qubit coherence challenges that make quantum computing impractical.
Sensors Already Make Money
The empirical fact that quantum sensing technologies are generating commercial revenue and positive ROI, in sharp contrast to quantum computing's zero commercial returns.
This contrast is one of the most powerful arguments in the course: quantum technology works commercially when it does not require the massive scale and coherence that quantum computing demands.
Example: The global market for quantum sensors (atomic clocks, magnetometers, gravimeters) exceeds $1 billion annually and is growing steadily — while quantum computing services generate approximately zero commercial revenue.
Shor's Algorithm 1994
Peter Shor's 1994 discovery of an efficient quantum algorithm for integer factoring, which provided the first compelling theoretical evidence that quantum computers could solve important problems exponentially faster than classical machines.
Shor's algorithm triggered the modern era of quantum computing investment by raising the specter of broken encryption. However, 30+ years later, it has only been run to factor the number 21, illustrating the vast gap between theoretical algorithms and practical implementation.
Shor's Factoring Algorithm
A quantum algorithm discovered by Peter Shor in 1994 that can factor large integers in polynomial time, exponentially faster than the best known classical algorithms. It represents the strongest theoretical case for quantum computing's potential.
Shor's algorithm is the primary justification for fears about quantum computing breaking encryption. However, running it on cryptographically relevant key sizes requires millions of physical qubits with error rates far below current capabilities, and post-quantum cryptography standards are already being deployed.
Example: The largest number factored by Shor's algorithm on actual quantum hardware is 21 (= 3 × 7), accomplished in 2012. Factoring a 2,048-bit RSA key would require roughly 4,000 logical qubits, or approximately 4-20 million physical qubits.
See also: Could QC Break Encryption?, Post-Quantum Crypto Exists
Skeptical Inquiry Method
A systematic approach to evaluating technology claims that combines base rate reasoning, evidence assessment, bias identification, incentive analysis, and probabilistic thinking into a repeatable analytical framework.
The skeptical inquiry method synthesizes the individual critical thinking tools taught throughout this course into an integrated methodology. It is the primary analytical output of the course and the skill most directly applicable beyond quantum computing.
Skeptics Have No Funding Bias
The observation that prominent quantum computing skeptics generally have no financial stake in the technology's failure and thus no systematic incentive to be pessimistic, in contrast to proponents who typically receive funding contingent on optimistic projections.
This asymmetry in incentives is crucial for evaluating the credibility of competing claims. Skeptics risk only their reputation by being wrong; proponents risk their funding, careers, and organizational commitments.
SPAC Risks in QC
The use of Special Purpose Acquisition Companies to take quantum computing startups public, bypassing the traditional IPO process's scrutiny and allowing companies with minimal revenue to access public markets.
SPACs enabled several quantum computing companies to go public at valuations disconnected from their financial fundamentals. The SPAC structure allowed forward-looking projections in marketing materials that would not be permitted in a traditional IPO prospectus.
Example: IonQ and Rigetti both went public via SPAC mergers in 2021-2022, reaching valuations in the billions despite combined annual revenues of less than $30 million.
Startup Pitch Exaggeration
The tendency of quantum computing startups to overstate the near-term potential, market size, and competitive advantages of their technology in fundraising materials to attract venture capital.
Startup pitch exaggeration is driven by the structure of venture capital, where companies compete for funding by making the most compelling case possible. In quantum computing, where technical verification is difficult for non-expert investors, this creates fertile ground for hype.
Example: Multiple quantum startup pitch decks from 2020-2023 projected achieving quantum advantage within 2-3 years and capturing significant market share in drug discovery or finance — none have delivered on these claims.
Steam Engine as GPT
The steam engine as an archetypal General Purpose Technology that was broadly applicable (mining, manufacturing, transportation), improved continuously (Newcomen to Watt to high-pressure), and spawned complementary innovations (railroads, factories, steamships).
The steam engine provides a useful contrast to quantum computing because its GPT characteristics were evident almost immediately: it worked, it had paying customers, and it improved steadily with each generation.
String Theory Career Warning
The cautionary example of string theory, where decades of intellectual investment by thousands of physicists produced elegant mathematics but no experimentally testable predictions, leaving many careers built on an unfalsifiable framework.
The string theory parallel is particularly apt because it involves similar dynamics: brilliant people, beautiful theory, enormous intellectual investment, strong insider incentive structures, and an increasingly uncertain connection to physical reality.
Student Advising Ethics
The ethical obligations of academic advisors when counseling students considering quantum computing careers, including the duty to present realistic assessments of career prospects and the field's scientific uncertainty.
Ethical advising requires honesty about risk. An advisor who encourages a student to pursue a quantum computing PhD without discussing the possibility that the field may not deliver commercial applications is failing in their ethical duty.
Successful Physics Bets
Historical cases where investment in physics-based technologies produced transformative commercial returns, including the transistor, laser, fiber optics, GPS atomic clocks, and MRI — all of which demonstrated early commercial viability.
Studying successful physics bets reveals a common pattern: the underlying technology worked reliably at small scale early on, costs declined predictably, and commercial applications emerged within a decade of laboratory demonstration. Quantum computing has matched none of these patterns.
Sunk Cost Escalation Loop
A reinforcing loop where past investment creates political and psychological pressure to invest more, which increases the past investment that creates future pressure. Each round of investment makes the next round harder to refuse.
This loop operates at every level — individual researchers whose careers depend on quantum computing, companies that have publicly committed to quantum roadmaps, and governments that have made quantum computing a national priority.
Sunk Cost Fallacy
The irrational tendency to continue investing in a project because of previously invested resources rather than based on future expected returns. Past expenditures are economically irrelevant to future investment decisions.
The sunk cost fallacy is arguably the most damaging bias in quantum computing, affecting individuals, corporations, and governments. The larger the past investment, the harder it becomes to walk away — creating a ratchet effect that escalates commitment regardless of results.
Example: "We've spent $2 billion on quantum computing and haven't seen returns, so we need to invest another $500 million to protect our investment" is textbook sunk cost reasoning. The $2 billion is gone regardless of the next decision.
See also: Sunk Cost Trap, Sunk Cost Escalation Loop
Sunk Cost Trap
The cognitive error of continuing to invest in quantum computing because of previously invested capital rather than based on the probability of future returns. Past spending is irrelevant to future investment decisions.
The sunk cost trap is particularly powerful in quantum computing because investments are very large, highly publicized, and tied to organizational prestige and career reputations. Walking away from a multibillion-dollar program is psychologically and politically difficult regardless of its prospects.
Example: "We've invested $10 billion in quantum computing — we can't stop now" is a sunk cost argument. The correct question is: "Given what we know today, would we invest the next $1 billion?"
See also: Sunk Cost Fallacy, Sunk Cost Escalation Loop
Superconducting Approach
A quantum computing platform using circuits made from superconducting materials (typically aluminum on silicon) cooled to near absolute zero, where electrical current flows without resistance. Qubits are formed from Josephson junctions that create quantized energy levels.
Superconducting qubits are the most mature platform, led by IBM and Google, but they suffer from short coherence times (50-100 microseconds), high error rates, and the cryogenic infrastructure requirements discussed throughout this course.
Example: Google's Sycamore processor (53 superconducting qubits) and IBM's Eagle/Condor series represent the current state of this approach, with gate error rates around 0.5-1%.
See also: Cryogenic Cooling Requirement, Near Absolute Zero Temps
Superposition Explained
The quantum mechanical property allowing a qubit to exist in a combination of both 0 and 1 states simultaneously, with each state having a probability amplitude. Superposition collapses to a definite value upon measurement.
Superposition is often cited as the source of quantum computing's power, with popular accounts claiming qubits can "try all answers at once." This oversimplification obscures the reality that extracting useful information from superposition requires carefully designed algorithms, of which very few exist.
Example: A coin spinning in the air is sometimes used as an analogy — it is neither heads nor tails until it lands — though this analogy is imprecise because superposition involves complex probability amplitudes, not simple 50/50 odds.
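The cancellation of amplitudes can be shown in a few lines of NumPy. This sketch applies a Hadamard gate twice to the |0> state: the first application creates an equal superposition, and the second makes the |1> amplitudes interfere destructively, returning |0> with certainty. No classical coin flip behaves this way, which is why the spinning-coin analogy breaks down.

```python
import numpy as np

# Why "tries all answers at once" misleads: amplitudes interfere.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
zero = np.array([1.0, 0.0])                   # the |0> state

superposed = H @ zero   # equal amplitudes on |0> and |1>
back = H @ superposed   # the |1> amplitudes cancel exactly

print(np.abs(superposed) ** 2)  # [0.5 0.5] -- looks like a fair coin
print(np.abs(back) ** 2)        # [1. 0.]  -- interference, not randomness
```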
Survivorship Bias
The error of drawing conclusions only from successful cases while ignoring failures. In quantum computing, investment narratives highlight the few potential applications while ignoring the vast number of problems where quantum offers no advantage.
Survivorship bias in quantum computing is reinforced by publication bias (journals prefer positive results), conference selection (events feature successes), and media coverage (breakthroughs are news; failures are not).
Example: Presentations on quantum computing applications typically highlight drug discovery, optimization, and cryptography while ignoring the thousands of commercial computing tasks where quantum offers zero advantage.
Systems Thinking
An analytical approach that examines how components of a system interrelate and influence each other over time, focusing on feedback loops, delays, and emergent behavior rather than individual events in isolation.
Systems thinking explains why quantum computing investment continues despite poor results. The quantum computing ecosystem is a complex system with multiple reinforcing feedback loops that sustain investment independently of technical progress.
See also: Causal Loop Diagrams, Reinforcing Feedback Loop
Technology Adoption Curves
S-shaped curves describing how new technologies move from innovation through early adoption, rapid growth, and saturation. Quantum computing has not yet entered the early adoption phase by any standard definition.
Applying technology adoption models to quantum computing reveals that the technology is still in the pre-commercial research phase, far earlier in the adoption curve than its investment levels would imply.

See also: Crossing the Chasm, Gartner Hype Cycle
Technology Bubble Dynamics
The self-reinforcing cycle where investment inflates expectations, which attracts more investment, creating asset valuations disconnected from fundamentals. Bubbles persist as long as new capital flows in faster than disillusioned investors withdraw.
Understanding bubble dynamics helps explain how quantum computing investment can persist despite poor technical progress. The inflow of new capital (government, VC, corporate) sustains the ecosystem even without commercial returns.
Technology Due Diligence
A systematic process for evaluating a technology investment, including independent technical assessment, financial analysis, competitive landscape review, management evaluation, and risk quantification.
Standard technology due diligence processes are often inadequate for quantum computing because evaluators lack the physics background to assess technical claims independently. This information asymmetry advantages insiders.
Technology Forecasting
The practice of predicting the development trajectory and timeline of new technologies, using methods ranging from expert surveys to trend extrapolation to scenario analysis. Technology forecasting has a poor track record, especially for paradigm-shifting technologies.
Quantum computing forecasts deserve particular scrutiny because they combine all the factors that make technology forecasting unreliable: fundamental scientific uncertainty, strong incentive biases among forecasters, and no historical precedent for the specific type of technology being developed.
Technology Readiness Levels
A systematic scale from TRL 1 (basic principles observed) to TRL 9 (proven through successful mission operations), originally developed by NASA to assess the maturity of new technologies.
TRL provides an objective, standardized framework for assessing where quantum computing actually stands versus where marketing materials imply it stands.
See also: TRL Scale Explained, QC Stuck at TRL 2-3
The 10 Required Breakthroughs
A framework identifying approximately 10 major technical breakthroughs that must all occur for quantum computing to become commercially viable, including error rates, qubit count, coherence, connectivity, cryogenics, algorithms, and cost.
Listing the required breakthroughs explicitly makes the magnitude of the remaining challenge visible and enables joint probability analysis of commercial viability.
See also: Joint Probability Problem, All Must Happen Together
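A minimal sketch of the joint probability arithmetic, using the seven breakthroughs named above and purely illustrative per-breakthrough probabilities (the inputs are assumptions; the multiplication is the point). Note that multiplying assumes the breakthroughs are independent; strong positive correlation between them would raise the joint figure.

```python
# Illustrative per-breakthrough probabilities -- assumptions, not data.
breakthrough_probs = {
    "error rates":  0.5,
    "qubit count":  0.5,
    "coherence":    0.5,
    "connectivity": 0.6,
    "cryogenics":   0.6,
    "algorithms":   0.4,
    "cost":         0.4,
}

joint = 1.0
for name, p in breakthrough_probs.items():
    joint *= p  # all must happen together, on the same platform

print(f"Joint probability: {joint:.3f}")  # ~0.007 under these assumptions
```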
Theoretical Promise of QC
The hypothesis that quantum computers could solve certain problems exponentially faster than classical computers by exploiting superposition, entanglement, and quantum interference to explore solution spaces more efficiently.
This promise drives billions in investment, but it remains largely theoretical. The gap between what quantum computers could do in principle and what they can do in practice is the central subject of this course.
Example: Shor's algorithm theoretically factors large integers exponentially faster than the best known classical algorithms, but no quantum computer has factored a number larger than 21 using it.
Theranos Lessons
The case of Theranos, where a charismatic founder raised billions for blood-testing technology that did not work as claimed, illustrating the dangers of hype, secrecy, and inadequate technical due diligence in technology investment.
While quantum computing companies are not committing fraud, the Theranos case illustrates how enthusiasm, authority bias, and information asymmetry can sustain massive investment in technology that does not work.
Example: Theranos reached a $9 billion valuation while its technology consistently failed to perform as claimed — a pattern enabled by secrecy, hype, and investors' failure to demand independent technical verification.
Timeline Pattern Analysis
The systematic study of historical quantum computing predictions and their outcomes, revealing a consistent pattern of overoptimistic timelines that slip repeatedly without accountability.
Tracking predictions against outcomes is one of the most powerful tools for assessing technology claims. In quantum computing, this analysis reveals that forecasted milestones routinely arrive 5-10 years late, if at all.
Example: In 2017, several companies predicted fault-tolerant quantum computers by 2023. In 2023, they revised those predictions to 2028-2030. The pattern has repeated across every five-year period since the 1990s.
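Tracking slippage can be reduced to simple arithmetic, as in this sketch built from the fault-tolerance example above (a real analysis would compile many dated predictions; the 2029 figure is the midpoint of the revised 2028-2030 range).

```python
# Receding-horizon metric: how fast does the milestone move away?
revisions = [(2017, 2023), (2023, 2029)]  # (year made, predicted year)

for (y0, t0), (y1, t1) in zip(revisions, revisions[1:]):
    elapsed = y1 - y0   # calendar years that passed
    slip = t1 - t0      # how far the milestone moved
    print(f"{elapsed} years passed; target slipped {slip} years")
    # slip / elapsed near 1.0 means the milestone recedes as fast as
    # time advances -- the signature of a perpetually distant goal.
```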
Topological Approach
A proposed quantum computing platform that would encode information in exotic quasiparticles called non-Abelian anyons, theoretically providing inherent error protection through the topological properties of the quantum states.
Topological qubits are the "holy grail" of quantum computing because they would theoretically require far less error correction. However, the required quasiparticles have never been conclusively demonstrated to exist in a controllable form, making this approach the most speculative.
Example: Microsoft has invested heavily in topological qubits for over 20 years based on Majorana fermions. As of 2025, no topological qubit has been demonstrated, and a key 2018 paper claiming evidence was retracted due to data manipulation.
Total $100B+ Invested
The estimated cumulative global investment in quantum computing from all sources — government, corporate, and venture capital — exceeding $100 billion through 2025.
This figure represents the scale of resources committed to a technology with zero commercial returns. It includes government programs (U.S., China, EU, UK, Japan), corporate R&D (IBM, Google, Microsoft, Amazon), and venture capital funding of startups.
Total Cost of Ownership
The complete lifecycle cost of a quantum computing system, including acquisition, installation, facilities, energy, cooling, maintenance, calibration, staffing, and classical computing infrastructure for control and error correction.
TCO analyses for quantum computing are rarely published because the numbers are discouraging. A realistic TCO must include the enormous classical computing infrastructure required to control, calibrate, and error-correct the quantum system.
Example: The TCO for a 100-qubit quantum computing system over 5 years may exceed $50 million when all costs are included, while the system cannot outperform a $10,000 classical server on any practical task.
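A minimal TCO sketch for the 100-qubit scenario in the example above. Every line item is an illustrative assumption, not a published vendor figure; the point is that the components sum quickly once facilities, staffing, and classical infrastructure are included.

```python
# Hypothetical annual costs for a 100-qubit system (illustrative only).
annual_costs = {
    "facilities and cryogenics":            2_000_000,
    "energy and cooling":                   1_000_000,
    "maintenance and calibration":          1_500_000,
    "specialist staffing":                  3_000_000,
    "classical control/error-correction":   1_500_000,
}
acquisition = 15_000_000  # assumed up-front system cost
years = 5

tco = acquisition + years * sum(annual_costs.values())
print(f"5-year TCO: ${tco:,}")  # $60,000,000 under these assumptions
```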
Transferable Skills Debate
The question of whether skills acquired in quantum computing research (physics, applied math, programming, systems engineering) would transfer effectively to other fields if quantum computing careers decline.
While quantum computing researchers have strong technical backgrounds, the most specialized skills (quantum error correction, qubit fabrication, quantum algorithm design) have limited applicability outside the field. General skills (Python, linear algebra, statistical analysis) are broadly transferable but do not require a quantum computing PhD.
Transistor as GPT
The transistor as a GPT that enabled the entire digital revolution, demonstrating broad applicability (computing, communications, consumer electronics), continuous improvement (Moore's Law), and massive complementary innovation.
The transistor is the most relevant GPT comparison because it is the technology quantum computing would theoretically supplement or replace. The transistor demonstrated clear, measurable, and continuous improvement from its inception — a trajectory quantum computing has not matched.
Transistor Investment Payoff
The transistor's trajectory from Bell Labs invention (1947) to commercial products (hearing aids by 1952, radios by 1954), demonstrating rapid translation from physics breakthrough to commercial returns.
The transistor provides the gold standard for physics-to-product timelines. Within 7 years of invention, transistors were in consumer products. Quantum computing, after 40+ years, has zero consumer products.
Example: The transistor moved from lab demonstration to mass-market consumer radios (the Regency TR-1) in just 7 years, generating commercial revenue almost immediately. After more than 40 years, quantum computing has no equivalent commercial product.
Trapped Ion Approach
A quantum computing platform that uses individual charged atoms (ions) suspended in electromagnetic fields as qubits, with laser pulses performing quantum gates. Offers longer coherence times and higher gate fidelities than superconducting approaches.
Trapped ion systems have demonstrated the highest-quality qubits but face severe scaling challenges. Operations are slow compared to superconducting systems, and trapping large numbers of ions in a single chain becomes physically impractical beyond roughly 30-50 qubits.
Example: IonQ's systems use ytterbium ions and achieve two-qubit gate fidelities above 99%, but gate speeds are roughly 1,000 times slower than superconducting gates, limiting computational throughput.
TRL Scale Explained
The nine-level Technology Readiness Level scale: TRL 1 (basic principles observed), TRL 2 (technology concept formulated), TRL 3 (experimental proof of concept), TRL 4 (validation in a laboratory), TRL 5 (validation in a relevant environment), TRL 6 (demonstration in a relevant environment), TRL 7 (prototype demonstration in an operational environment), TRL 8 (system complete and qualified), TRL 9 (proven in successful operations).
Understanding the full TRL scale helps investors and policymakers accurately assess where quantum computing stands relative to deployment readiness. Most quantum computing falls in the TRL 2-4 range, despite marketing that implies TRL 6+.
Trough of Disillusionment
The phase of the Gartner Hype Cycle where reality fails to meet inflated expectations, leading to negative press, funding reductions, company failures, and a more sober reassessment of the technology's actual potential.
Evidence suggests quantum computing may be entering the Trough of Disillusionment in 2025-2026, with declining VC funding, falling SPAC stock prices, and growing skepticism in mainstream technology media.
Example: The decline in quantum computing VC funding from 2022 to 2025, combined with stock price drops for IonQ and Rigetti, may signal the beginning of the Trough of Disillusionment.
Unfalsifiable Timelines
Predictions about quantum computing progress structured in ways that cannot be proven wrong: either sufficiently vague ("in the coming years"), sufficiently far out ("by 2040"), or accompanied by enough caveats to allow retroactive reinterpretation.
Unfalsifiable claims are scientifically meaningless. Critical evaluation requires asking: "What specific outcome, by what specific date, would prove this prediction wrong?" If the forecaster cannot answer, the prediction has no informational value.
VC Funding Frenzy
The surge of venture capital investment into quantum computing startups from roughly 2018-2023, driven by hype, FOMO, and the availability of cheap capital, resulting in overvalued companies with no clear path to revenue.
The VC frenzy illustrates how financial market dynamics can sustain investment in technologies far beyond what technical fundamentals justify. Many quantum startups raised hundreds of millions of dollars without demonstrating any competitive advantage over classical computing.
Example: Over $3 billion in venture capital flowed into quantum computing startups in 2021-2022 alone, a period when interest rates were low and technology valuations were inflated across the sector.
Venture Capital Loss Rates
The percentage of venture capital investments in quantum computing expected to result in total or near-total loss, estimated by industry analysts at 70-90% — significantly higher than the 60-70% typical of early-stage technology investing.
Higher loss rates in quantum computing reflect the additional layer of fundamental physics risk that does not exist in most technology investments. Software and biotech startups can fail for market or execution reasons, but they rarely fail because the underlying science does not work.
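The implication of these loss rates can be computed directly. This sketch shows the multiple that a portfolio's surviving investments must return just to break even under an assumed total-loss rate, which is sobering when typical acqui-hire exits return well under 1x.

```python
# Break-even arithmetic for a portfolio under assumed total-loss rates.
for loss_rate in (0.70, 0.80, 0.90):
    surviving_fraction = 1 - loss_rate
    breakeven_multiple = 1 / surviving_fraction
    print(f"loss rate {loss_rate:.0%}: survivors must return "
          f"{breakeven_multiple:.1f}x invested capital to break even")
# 70% -> 3.3x, 80% -> 5.0x, 90% -> 10.0x
```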
What If the Field Contracts?
The scenario analysis of career and economic consequences if quantum computing fails to achieve commercial viability and investment declines significantly, as has occurred in other overhyped technology fields.
Field contraction is a realistic scenario that students and early-career researchers should consider. Previous examples (superconducting electronics, molecular computing, analog neural networks) show that field contractions can be sudden and severe.
What Is a Qubit
A quantum bit — the fundamental unit of quantum information. Unlike a classical bit, a qubit can exist in a superposition of 0 and 1 states simultaneously, represented mathematically as \(\alpha|0\rangle + \beta|1\rangle\) where \(\alpha\) and \(\beta\) are complex amplitudes.
Qubits are the building blocks of quantum computing, but their extreme fragility is a central reason quantum computers remain impractical. The gap between theoretical qubit capabilities and engineering reality is a recurring theme in this course.
Example: A superconducting qubit typically maintains its quantum state for less than 100 microseconds before decoherence destroys the information it holds.
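The state-vector formalism fits in a few lines of NumPy. This sketch represents a qubit as a 2-component complex vector and applies the Born rule; the specific amplitudes are illustrative, and any normalized pair works.

```python
import numpy as np

# A qubit state alpha|0> + beta|1> as a 2-component complex vector.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
state = np.array([alpha, beta])

# Born rule: measurement probabilities are squared amplitude magnitudes.
probs = np.abs(state) ** 2
assert np.isclose(probs.sum(), 1.0)  # normalization: |alpha|^2 + |beta|^2 = 1
print(probs)  # [0.5 0.5] -- measurement yields 0 or 1 with equal odds
```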
What QC Cannot Do
Quantum computers cannot speed up all computations — only those with specific mathematical structure exploitable by quantum algorithms. They cannot replace classical computers for general-purpose tasks, and they offer no advantage for most everyday computing workloads.
Understanding what quantum computing cannot do is as important as understanding what it theoretically could do. Many investment pitches gloss over these limitations, creating unrealistic expectations about quantum computing's commercial potential.
Example: Quantum computers offer no speedup for tasks like word processing, web browsing, database queries, or most machine learning training — the vast majority of commercial computing workloads.
What QC Could Supposedly Do
The set of applications frequently cited by proponents: breaking encryption, simulating molecules for drug discovery, optimizing logistics, improving machine learning, and modeling financial markets.
Skeptical analysis reveals that most of these applications either require fault-tolerant quantum computers that do not exist, offer only modest speedups over classical methods, or solve problems that classical AI is already addressing effectively.
Example: Quantum chemistry simulation is often cited as the "killer app," but the largest molecule simulated on a quantum computer remains far smaller than what classical methods like density functional theory can handle.
When Scientists Mislead
Cases where scientists — whether through fraud, self-deception, or excessive optimism — have misrepresented the state of their research, leading to wasted investment and damaged trust in science.
Scientists are human and subject to the same cognitive biases as everyone else. The pressure to secure funding, the desire for recognition, and genuine self-deception can lead even honest researchers to present results more favorably than warranted.
Example: The retraction of a major Microsoft-funded paper claiming evidence of Majorana fermions (the basis for topological quantum computing) due to data manipulation illustrates how the pressure to produce results can corrupt scientific integrity.
When to Cut Losses
The decision framework for determining when continued investment in quantum computing is no longer rationally justified, based on probability updates from new technical evidence, timeline slippage, and opportunity cost analysis.
Knowing when to stop is as important as knowing when to start. The sunk cost fallacy makes this decision psychologically difficult, but rigorous expected value analysis can provide a rational basis for exit decisions.
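A minimal expected-value sketch of the go/no-go decision on a next funding tranche. All of the probabilities and dollar figures are illustrative assumptions; the structural point is that sunk costs appear nowhere in the calculation.

```python
# Forward-looking expected value of the NEXT tranche (illustrative inputs).
p_success = 0.05              # assumed probability the bet pays off
payoff    = 5_000_000_000     # assumed value if it works
tranche   = 1_000_000_000     # the investment actually being decided

ev_invest = p_success * payoff - tranche  # EV of continuing
ev_alternative = tranche * 0.5            # assumed return from an alternative

print(f"EV of next tranche: ${ev_invest:,.0f}")       # -$750,000,000
print(f"EV of alternative:  ${ev_alternative:,.0f}")  # +$500,000,000
# Note: the billions already spent never enter the arithmetic.
```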
Who Pays for QC?
The unresolved question of quantum computing's commercial customer base. No category of customer has demonstrated willingness to pay for quantum computation at prices that could sustain the industry's enormous cost structure.
This is the most basic business question and the one most consistently avoided in quantum computing investment discussions. "Build it and they will come" is not a business strategy; it is a hope.
Why Skeptics Are Ignored
The structural reasons why quantum computing skeptics receive less attention than proponents: they have no marketing budgets, conference platforms are controlled by proponents, media prefers optimistic stories, and funders prefer optimistic forecasts.
The silencing of skeptics is itself evidence of a dysfunctional information ecosystem. In healthy scientific discourse, critics play a valued role. In quantum computing, the incentive structure systematically marginalizes critical voices.
Window Is Already Closing
The observation that the period of vulnerability to quantum cryptographic attacks is narrowing as post-quantum cryptography standards are deployed, potentially closing the window before quantum computers become powerful enough to exploit it.
If post-quantum cryptography is widely deployed by 2030 and quantum computers capable of breaking current encryption do not emerge until 2040 or later (if ever), the "harvest now, decrypt later" threat applies only to a finite and shrinking window of legacy encrypted data.
Wiring and Control Problem
The challenge of routing the electrical control lines, readout lines, and calibration signals needed for each qubit from room temperature electronics down to the millikelvin operating environment. Current systems require 2-5 physical wires per qubit.
With thousands of qubits, the wiring becomes a physical bottleneck — there is limited space inside dilution refrigerators, and each wire conducts unwanted heat into the quantum system. This "wiring crisis" has no proven solution at million-qubit scale.
Example: A 1,000-qubit system may need 3,000 or more physical connections between room temperature and millikelvin stages, each carrying heat that must be removed by limited cryogenic cooling power.
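Back-of-envelope arithmetic makes the bottleneck vivid. This sketch scales the per-qubit wire count cited above; the per-wire heat load is an assumed round number, and actual loads depend on wire material and thermal anchoring. For reference, dilution refrigerators typically supply well under a milliwatt of cooling power at the coldest stage.

```python
# Scaling the wiring problem (per-wire heat load is an assumption).
wires_per_qubit = 3       # midpoint of the 2-5 range cited above
heat_per_wire_uW = 0.1    # assumed microwatts leaked per wire to the cold stage

for qubits in (1_000, 1_000_000):
    wires = qubits * wires_per_qubit
    heat_uW = wires * heat_per_wire_uW
    print(f"{qubits:>9,} qubits -> {wires:>9,} wires, "
          f"~{heat_uW:,.0f} uW conducted to the millikelvin stage")
# Even the 1,000-qubit case approaches typical cryogenic cooling budgets;
# the million-qubit case exceeds them by orders of magnitude.
```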
Writing a Critical Review
The skill of producing a structured, evidence-based critique of a quantum computing paper, report, or announcement, distinguishing between what is claimed, what is supported by evidence, and what is speculated.
Critical review writing forces disciplined thinking by requiring explicit identification of evidence, assumptions, and logical gaps. It is a foundational academic skill with direct practical applications in investment analysis.
Writing an Executive Brief
The skill of condensing complex quantum computing analysis into a concise (2-4 page) document for senior decision-makers, emphasizing key findings, risk assessment, and actionable recommendations.
Executive briefs are a critical output format because most investment and policy decisions about quantum computing are made by people who will not read a full technical analysis. The brief must convey essential risk information clearly and concisely.
"3-5 Years Away" Pattern
The recurring phenomenon where quantum computing breakthroughs are perpetually predicted to be "3-5 years away," a timeframe long enough to seem plausible but short enough to sustain investment and enthusiasm.
This pattern is not unique to quantum computing — it appears in fusion energy, AGI, autonomous vehicles, and other technologies with strong investment incentives and weak feedback loops. Recognizing this pattern is a core critical thinking skill taught in this course.
Example: In 2015, practical quantum computing was "5-10 years away." In 2020, it was "5-10 years away." In 2025, it is "5-10 years away."
40 Years of Promises
The observation that quantum computing has been promising transformative results since Feynman's 1981 proposal, yet after more than four decades, the technology has not delivered a single commercially viable application.
This long timeline of unfulfilled promises should inform probability estimates about future claims. A technology that has been "almost ready" for 40 years requires extraordinary evidence before its latest predictions should be believed.
Zero Commercial ROI by 2025
The empirical fact that no quantum computing company or project has generated a positive return on investment from quantum computation services as of 2025, despite cumulative global investment exceeding $100 billion.
This is the most damning single data point in the quantum computing investment debate. After four decades of research and enormous capital deployment, the technology has produced exactly zero commercial returns. Any other technology with this track record would face far greater scrutiny.
Zero Revenue Generated
The fact that quantum computing has generated essentially no commercial revenue from quantum computation services (as opposed to government research grants, consulting fees, or cloud access for educational purposes).
Some quantum companies report modest revenue, but virtually all of it comes from government grants, educational access, or consulting — not from customers paying for quantum computations that outperform classical alternatives.