Cognitive Biases in Quantum Computing Investment
Summary
This chapter examines the psychological biases that sustain investment in quantum computing despite absent returns. We cover sunk cost fallacy, confirmation bias, anchoring bias, bandwagon effect, Dunning-Kruger effect, optimism bias, authority bias, FOMO, survivorship bias, narrative fallacy, groupthink, motivated reasoning, and information asymmetry. We show how these biases are not independent but compound each other, creating a nearly impenetrable wall of irrational optimism in technology forecasting. Students will be able to identify these biases in quantum computing press releases, investor pitches, and their own thinking.
Concepts Covered
This chapter covers the following 16 concepts from the learning graph:
- Cognitive Bias Overview
- Sunk Cost Fallacy
- Confirmation Bias
- Anchoring Bias
- Bandwagon Effect
- Dunning-Kruger in QC
- Optimism Bias
- Authority Bias
- FOMO in QC Investment
- Survivorship Bias
- Narrative Fallacy
- Groupthink
- Motivated Reasoning
- Information Asymmetry
- Bias in Tech Forecasting
- How Biases Compound
Prerequisites
This chapter builds on concepts from:
Fermi Welcomes You!
Welcome, fellow investigators! We've examined the physics barriers, the financial numbers, and the assessment frameworks. Every rational analysis says quantum computing investment is hard to justify. So why does the money keep flowing? The answer isn't in the physics — it's in the psychology. This chapter is about the bugs in human thinking that quantum computing hype exploits. But does the math check out? Let's find out!
Learning Objectives
After completing this chapter, you will be able to:
- Define cognitive bias and explain why intelligent people are not immune to biased thinking
- Identify at least twelve specific cognitive biases that operate in quantum computing investment decisions
- Explain how information asymmetry between physicists and investors creates exploitable gaps
- Analyze quantum computing press releases, investor pitches, and news articles for embedded biases
- Demonstrate how individual biases compound into a system of mutually reinforcing irrationality
- Apply bias detection techniques to your own thinking about emerging technology claims
What Are Cognitive Biases?
A cognitive bias is a systematic deviation from rational judgment that arises from the brain's use of mental shortcuts (heuristics). Biases are not mistakes made by careless thinkers — they are predictable patterns of error built into the architecture of human cognition. Nobel laureates, seasoned investors, and brilliant physicists are all susceptible, because biases operate below the level of conscious reasoning.
The field of cognitive bias research, pioneered by Daniel Kahneman and Amos Tversky in the 1970s, has identified over 180 distinct biases. Not all are relevant to technology investment, but a core set of biases operates powerfully in the quantum computing space. These biases do not merely coexist — they reinforce each other, creating feedback loops that make rational assessment extraordinarily difficult.
The biases we will examine fall into four functional categories:
| Category | Biases | Function in QC Hype |
|---|---|---|
| Commitment escalation | Sunk cost fallacy, motivated reasoning | Keep money flowing after negative signals |
| Information distortion | Confirmation bias, anchoring, narrative fallacy | Filter evidence to support optimistic conclusions |
| Social pressure | Bandwagon effect, authority bias, FOMO, groupthink | Make dissent costly and conformity rewarding |
| Knowledge illusion | Dunning-Kruger, optimism bias, survivorship bias | Create false confidence in uninformed assessments |
Part I: Commitment Escalation Biases
Sunk Cost Fallacy
The sunk cost fallacy occurs when past, irrecoverable expenditures influence decisions about future spending. Rationally, only future costs and benefits should matter — what you've already spent is gone regardless. But psychologically, abandoning an investment feels like "wasting" the money already spent, even though continuing may waste even more.
In quantum computing, the sunk cost fallacy operates at every level:
- Government level: "We've committed $1.2 billion through the National Quantum Initiative — we can't defund it now." The political cost of admitting a national initiative was premature exceeds the financial cost of continuing it.
- Corporate level: "IBM has invested billions in its quantum roadmap and hired hundreds of PhDs. Shutting down the program would require writing off the investment and laying off the team." The career consequences for executives who championed the program make continuation the path of least resistance.
- Venture capital level: "Our fund has $200 million in quantum computing positions. If we mark them down to zero, our fund performance collapses." Holding losing positions at inflated valuations preserves the appearance of performance.
- Individual researcher level: "I've spent my entire PhD and postdoc on quantum error correction. If the field contracts, my expertise becomes worthless." Career sunk costs create powerful incentives for optimism.
Bias Alert
The sunk cost fallacy is the single most powerful bias sustaining quantum computing investment. Every rational framework — ROI, NPV, expected value — evaluates future outcomes only. But the human brain cannot ignore the past. When someone says "we've invested too much to stop now," they are making a textbook sunk cost error. The rational response is always: "Regardless of what we've spent, does the next dollar have positive expected value?" If the answer is no, stop.
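The "next dollar" test in the alert above can be written as a tiny decision function. All figures below are hypothetical (in billions of dollars); the point is that the sunk amount never enters the calculation.

```python
# Minimal sketch of the "next dollar" test. Figures are hypothetical;
# sunk_cost is accepted as a parameter only to show it plays no role.
def should_continue(future_cost, success_prob, payoff_if_success, sunk_cost=0.0):
    """Continue only if the FUTURE expected value is positive."""
    expected_value = success_prob * payoff_if_success - future_cost
    return expected_value > 0  # sunk_cost never appears in the math

# $2B already spent, $1B more needed, 5% success odds, $10B payoff if it works:
print(should_continue(future_cost=1.0, success_prob=0.05,
                      payoff_if_success=10.0, sunk_cost=2.0))  # prints False
```

Whether you have spent $2 billion or $20 billion, the function returns the same answer, because only the future terms appear in the expected-value expression.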
Motivated Reasoning
Motivated reasoning occurs when the desire to reach a specific conclusion shapes how we evaluate evidence. Unlike conscious dishonesty, motivated reasoning operates unconsciously — the reasoner genuinely believes they are thinking objectively, but their evaluation process is systematically skewed toward the desired outcome.
In quantum computing, motivated reasoning is pervasive because nearly everyone in the ecosystem has a reason to want the technology to succeed:
| Stakeholder | Motivation | How Reasoning Is Distorted |
|---|---|---|
| Researchers | Career funding, publication record | Interpret incremental results as confirming the overall thesis |
| Startup founders | Equity value, reputation | Emphasize best-case scenarios; minimize technical barriers |
| VCs | Fund returns, deal flow reputation | Seek confirming analogies (internet, transistors); dismiss disconfirming ones (fusion, flying cars) |
| Corporate executives | Internal prestige, budget authority | Frame speculative research as "strategic positioning" |
| Government officials | Political narrative, job creation | Cite national security to avoid cost-benefit analysis |
| Journalists | Compelling stories, reader engagement | Report breakthroughs uncritically; skepticism isn't "exciting" |
The challenge with motivated reasoning is that it is invisible to the person engaging in it. A researcher who sees a modest improvement in qubit fidelity and describes it as "a major step toward fault-tolerant quantum computing" may genuinely believe that characterization — because their career depends on it being true.
Part II: Information Distortion Biases
Confirmation Bias
Confirmation bias is the tendency to seek, interpret, and remember information that confirms pre-existing beliefs while ignoring or discounting contradictory evidence. It is arguably the most studied cognitive bias, and it operates powerfully in quantum computing.
Consider how confirmation bias manifests in practice:
- Seeking confirming information: An investor who believes in quantum computing reads every press release about qubit improvements and hardware milestones. They do not read papers by Gil Kalai or Michel Dyakonov arguing that fault-tolerant quantum computing may be impossible.
- Interpreting ambiguous evidence favorably: When IBM announces a 1,121-qubit chip, a confirmation-biased observer interprets this as "progress toward useful quantum computing." A neutral observer notes that the chip's error rates make it useless for fault-tolerant computation and that qubit count alone is meaningless without fidelity.
- Remembering hits, forgetting misses: People remember Google's "quantum supremacy" announcement but forget that the "supremacy" benchmark was immediately challenged and had no commercial relevance.
Confirmation bias is particularly dangerous in quantum computing because the field generates a steady stream of genuine but commercially irrelevant progress. Better error rates, more qubits, longer coherence times — all real achievements, all confirming to the biased observer, and all insufficient for commercial viability.
Anchoring Bias
Anchoring occurs when an initial piece of information disproportionately influences subsequent judgments. In quantum computing, the most common anchors are:
- Theoretical speedup claims: "Quantum computers are exponentially faster than classical computers!" This anchor (which is true only for a tiny class of problems) sets expectations that every quantum computing application will deliver astronomical speedups.
- Market size projections: McKinsey's $450-850 billion projection anchors investors' sense of the opportunity, even though the projection is based on assumptions about physics breakthroughs that haven't occurred.
- Qubit count milestones: IBM's announcement of a 1,121-qubit chip anchors the perception that quantum computing is "almost there," even though the relevant metric (error-corrected logical qubits) remains at approximately zero.
Once anchored, people adjust insufficiently away from the anchor. An investor who starts from "quantum computing is a $450 billion opportunity" and then encounters skeptical evidence may revise downward — but typically arrives at something like "$100 billion," which is still far above any figure supportable by current evidence.
Key Insight
Anchoring explains why quantum computing investment persists despite negative evidence. The anchors — "exponential speedup," "$450 billion market," "1,000+ qubits" — are set high. Skeptical evidence causes adjustment, but never enough. The anchored investor concludes "even if it's only 10% of the projected market, that's $45 billion — still huge!" But 10% of a fictional number is still fictional. The correct anchor is zero commercial revenue, and you should adjust upward from there only with evidence.
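The two anchoring directions can be made concrete with back-of-envelope arithmetic. The $450B figure is the projection quoted above; the "10%" haircut is the typical insufficient adjustment the insight describes, not a measured value.

```python
# Illustrative arithmetic only: two starting anchors for the same estimate.
high_anchor = 450e9                      # assumption-driven market projection
anchored_estimate = 0.10 * high_anchor   # "even 10% is huge" -> $45B

evidence_anchor = 0.0                    # current commercial QC revenue: ~zero
evidence_based_estimate = evidence_anchor  # adjust upward only with revenue

print(f"Anchored-down estimate: ${anchored_estimate/1e9:.0f}B; "
      f"evidence-anchored estimate: ${evidence_based_estimate/1e9:.0f}B")
```

Adjusting down from a fictional number still yields a fictional number; adjusting up from zero forces every dollar of the estimate to be justified by evidence.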
Narrative Fallacy
The narrative fallacy, described by Nassim Nicholas Taleb, is the human tendency to construct coherent stories from random or loosely connected facts, and then to mistake the story for evidence. Humans are story-telling animals — we understand the world through narratives, and we find narrative explanations more persuasive than statistical evidence.
The quantum computing industry runs on narrative:
- The Hero's Journey: "Brilliant physicists are on the verge of a breakthrough that will transform civilization." This narrative is emotionally compelling and deeply satisfying, even though it contains no evidence about timelines or probability.
- The Race Narrative: "The US and China are in a quantum computing race. Whoever gets there first wins." This narrative creates urgency and frames skepticism as unpatriotic, even though the "race" presupposes that the finish line exists and is reachable.
- The Inevitable Progress Narrative: "Computing always gets better. Quantum computing is the next frontier." This narrative maps quantum computing onto the arc of classical computing progress, even though the physics barriers are fundamentally different.
Each narrative is more persuasive than the underlying evidence warrants, because stories bypass analytical scrutiny. When an investor hears "this startup is building the future of computing," the narrative engages their imagination before their analytical faculties can intervene.
Part III: Social Pressure Biases
Bandwagon Effect
The bandwagon effect occurs when the perceived popularity of a belief or behavior increases its adoption. In investment, this manifests as: "Google, IBM, and Microsoft are all investing in quantum computing — they must know something I don't."
The bandwagon effect in quantum computing is reinforced by several factors:
- Prestige of participants: When Nobel laureates, elite universities, and trillion-dollar corporations are involved, skepticism feels presumptuous
- Media coverage volume: The sheer quantity of positive coverage creates an impression of momentum
- Conference circuits: Quantum computing conferences, workshops, and industry events create echo chambers where optimism is the social norm
- Social proof in investing: "If Andreessen Horowitz invested, it must be a good deal" — even though VC firms regularly lose money on individual investments
The bandwagon effect is particularly insidious because it can create genuine consensus without genuine evidence. When enough smart people believe something, the belief itself becomes evidence for others — a circular dynamic that can sustain incorrect beliefs indefinitely.
Authority Bias
Authority bias leads people to over-weight the opinions of perceived authorities, regardless of whether those authorities have relevant expertise or conflicted incentives. In quantum computing:
- Nobel laureates who endorse quantum computing are cited as evidence of viability — even though their expertise in fundamental physics does not translate to expertise in engineering scalability or commercial viability
- CEOs of quantum computing companies are treated as objective sources — even though they have enormous financial incentives to be optimistic
- University professors with quantum computing grants are cited as neutral experts — even though their funding depends on continued government investment in the field
The critical question that authority bias prevents people from asking is: "Does this authority have the right kind of expertise, and do they have conflicted incentives?"
| Authority | Representative Claim | Actual Expertise Gap | Incentive Conflict |
|---|---|---|---|
| Physics Nobel laureate | "Quantum computing will work" | May not understand engineering scalability or economics | Prestige from being associated with "the next big thing" |
| QC company CEO | "We're on track to commercialize" | Technical expertise may be genuine | Massive financial incentive to project optimism |
| VC partner | "This is a trillion-dollar opportunity" | Rarely understands quantum physics | Fund performance depends on maintaining valuations |
| Government official | "National security requires quantum investment" | Almost never understands the physics or economics | Political incentive to appear forward-looking |
FOMO in Quantum Computing Investment
Fear of Missing Out (FOMO) drives investment decisions based on anxiety about being left behind rather than analysis of expected returns. In quantum computing, FOMO operates at multiple scales:
- Individual investor FOMO: "What if quantum computing is the next internet and I missed it?"
- Corporate FOMO: "Our competitors have quantum computing programs — we'll be left behind if we don't start one"
- National FOMO: "China is investing $15 billion in quantum computing — if we don't match it, we'll lose the technology race"
FOMO is amplified by the geopolitical framing of quantum computing, which transforms a technology investment question into a national security question. Once framed as "falling behind China," the investment decision bypasses cost-benefit analysis entirely — because the cost of "losing" is framed as catastrophic and unquantifiable.
Fermi's Tip
When you feel FOMO about a technology investment, ask yourself: "Am I afraid of missing a specific, quantified opportunity — or am I afraid of a vague, unnamed catastrophe?" If the answer is the latter, FOMO is driving the decision, not analysis. The cure for FOMO is specificity: name the opportunity, estimate the probability, and calculate the expected value. If you can't do that, you're not investing — you're panicking.
Groupthink
Groupthink occurs when a cohesive group prioritizes consensus over critical evaluation, suppressing dissent and discouraging the exploration of alternative viewpoints. Irving Janis identified groupthink as a factor in major policy failures such as the Bay of Pigs invasion; later analysts applied the same framework to the Challenger disaster.
The quantum computing community exhibits classic groupthink symptoms:
- Illusion of invulnerability: "Quantum computing is inevitable — it's just a matter of time" (dismisses the possibility of failure)
- Collective rationalization: "Error rates are improving; coherence times are increasing; we just need to stay the course" (reframes lack of commercial progress as temporary)
- Stereotyping of outsiders: "Skeptics just don't understand quantum mechanics" (dismisses criticism by questioning critics' competence rather than engaging their arguments)
- Self-censorship: Researchers who privately doubt the timeline publicly remain silent to avoid career consequences
- Illusion of unanimity: The absence of public dissent is interpreted as agreement, even though silence may reflect suppression rather than consensus
- Direct pressure on dissenters: Researchers who publish skeptical analyses report being excluded from conferences, denied grants, or criticized by colleagues
The quantum computing research community is small enough that groupthink can operate effectively. Researchers attend the same conferences, review each other's papers, and compete for the same grants. The social cost of public skepticism is high; the social reward for optimism is significant.
Diagram: Cognitive Bias Network in QC Investment
Cognitive Bias Network in Quantum Computing Investment
Type: graph-model
sim-id: cognitive-bias-network
Library: vis-network
Status: Specified
Learning Objective: Analyze how individual cognitive biases interact and reinforce each other in quantum computing investment, identifying the strongest amplification pathways (Bloom's Level 4: Analyze — examine, differentiate, attribute).
Instructional Rationale: An interactive network graph is appropriate because the Analyze/examine objective requires learners to trace relationships between concepts. The force-directed layout reveals clusters of mutually reinforcing biases, and click-to-highlight reveals amplification chains that linear text cannot convey.
Graph Structure:
Nodes (16 total — 13 biases, two outcome nodes, and one hub — sized by number of connections):
- "Sunk Cost Fallacy" (red, large)
- "Confirmation Bias" (red, large)
- "Anchoring Bias" (orange, medium)
- "Bandwagon Effect" (orange, medium)
- "Dunning-Kruger" (yellow, medium)
- "Optimism Bias" (orange, medium)
- "Authority Bias" (orange, medium)
- "FOMO" (red, large)
- "Survivorship Bias" (yellow, small)
- "Narrative Fallacy" (orange, medium)
- "Groupthink" (red, large)
- "Motivated Reasoning" (red, large)
- "Information Asymmetry" (purple, large)
- "Bias in Tech Forecasting" (gray, medium — outcome node)
- "How Biases Compound" (gray, large — outcome node)
- "Cognitive Bias Overview" (blue, central hub)
Edges (reinforcement relationships, with labels):
- Sunk Cost → Motivated Reasoning ("past spending motivates optimism")
- Confirmation Bias → Anchoring ("confirming info reinforces anchors")
- Confirmation Bias → Narrative Fallacy ("stories filter for confirming evidence")
- Bandwagon Effect → FOMO ("popularity creates fear of missing out")
- Bandwagon Effect → Groupthink ("herd behavior suppresses dissent")
- Dunning-Kruger → Authority Bias ("overconfident non-experts defer to wrong authorities")
- Optimism Bias → Confirmation Bias ("optimists seek confirming evidence")
- Authority Bias → Bandwagon Effect ("prestigious endorsement drives adoption")
- FOMO → Sunk Cost ("fear-driven investment creates sunk costs")
- Groupthink → Motivated Reasoning ("group consensus motivates conformity")
- Information Asymmetry → Dunning-Kruger ("knowledge gap creates false confidence")
- Information Asymmetry → Authority Bias ("non-experts must rely on authorities")
- Motivated Reasoning → Confirmation Bias ("desired conclusions filter evidence")
- Survivorship Bias → Optimism Bias ("only successes are visible")
- Narrative Fallacy → Bandwagon Effect ("compelling stories attract followers")
- All individual biases → How Biases Compound
- All individual biases → Bias in Tech Forecasting
Interactive Features:
- Click any bias node to highlight all connected biases and their reinforcement edges
- Hover over an edge to see the reinforcement mechanism described in a tooltip
- Double-click a node to see a QC-specific example in a popup
- Toggle: "Show amplification chains" — traces the longest reinforcement paths
- Toggle: "Show by category" — colors nodes by category (commitment, information, social, knowledge)
- Force-directed layout with drag-to-reposition
Visual Style:
- Node size proportional to connection count (sunk cost, FOMO, groupthink largest)
- Edge thickness proportional to reinforcement strength
- Animated edge pulses when a reinforcement chain is highlighted
- Node colors: red (strongest biases), orange (moderate), yellow (supporting), purple (structural), gray (outcome), blue (overview)
- Background: aliceblue
Responsive Design: Graph scales with container; node labels adjust font size for readability.
Implementation: vis-network with custom interaction handlers
Part IV: Knowledge Illusion Biases
Dunning-Kruger Effect in Quantum Computing
The Dunning-Kruger effect describes a pattern where people with limited knowledge in a domain overestimate their competence, while experts tend to underestimate theirs. In quantum computing, this creates a specific problem: the people making investment decisions (VCs, corporate executives, government officials) typically lack the physics expertise to evaluate the technology, yet their limited exposure to quantum computing concepts gives them unwarranted confidence in their assessments.
The Dunning-Kruger dynamic in quantum computing follows a predictable pattern:
- An investor attends a quantum computing conference or reads a popular article
- They learn terms like "superposition," "entanglement," and "exponential speedup"
- They feel they understand the technology — enough to evaluate an investment
- They cannot assess whether error rates are fundamentally limited, whether the 1,000:1 overhead ratio can be reduced, or whether decoherence is an engineering or physics problem
- They invest based on narrative, analogy, and social proof rather than technical assessment
The experts — the physicists who understand the barriers — are often in positions where expressing skepticism is career-limiting. The result is that the most confident voices in the room are the least informed, and the most informed voices are either silent or ignored.
Optimism Bias
Optimism bias is the systematic tendency to overestimate the likelihood of positive outcomes and underestimate the likelihood of negative ones. In technology investment, optimism bias manifests as:
- Overestimating the probability that breakthroughs will occur on schedule
- Underestimating the difficulty of the remaining challenges
- Overweighting best-case scenarios in decision-making
- Underweighting base rates of technology failure
Research on optimism bias, including Kahneman and Tversky's related work on the planning fallacy, suggests that it is strongest when:
- The outcome is personally important (high stakes)
- The person has some control or involvement (sense of agency)
- The timeframe is distant (future events feel more optimistic than near-term ones)
All three conditions apply to quantum computing investment: the stakes are high (billions of dollars), the investors feel involved (they've chosen the bet), and the payoff is distant (10-20+ years). This is the perfect storm for optimism bias.
Survivorship Bias
Survivorship bias occurs when conclusions are drawn from a non-representative sample that includes only "survivors" (successes) while ignoring the much larger pool of failures. In technology investment, survivorship bias manifests as:
- Citing transistors, lasers, and the internet as evidence that "bold physics bets pay off" — while ignoring the dozens of bold physics bets that failed (cold fusion, nanotechnology assemblers, superconducting computing in the 1980s)
- Pointing to Amazon and Google as evidence that bubble-era investments can succeed — while ignoring the thousands of dot-com companies that went bankrupt
- Celebrating D-Wave's continued existence as evidence of commercial viability — while ignoring that it has lost money every year for 25 years
The corrective for survivorship bias is always to ask: "What about the ones that didn't make it?" For every transistor success story, there are dozens of materials science investments that failed silently. The survivors tell us nothing about the base rate of success.
Key Insight
Survivorship bias is the quantum computing optimist's favorite tool. "The transistor was once considered impossible too!" Yes — and so were hundreds of other technologies that actually were impossible. The transistor's success tells us nothing about quantum computing's probability of success, just as a lottery winner's success tells us nothing about the expected value of buying a lottery ticket. You need the base rate — the failure rate — not the highlight reel.
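A quick simulation makes the base-rate point concrete. The 5% success rate and 20x payoff below are hypothetical placeholders for "bold physics bets," not figures from any study.

```python
import random

random.seed(1)  # reproducible illustration

# Hypothetical base rate: 5% of bold physics bets succeed, paying 20x;
# the rest fail silently and never make the highlight reel.
bets = [random.random() < 0.05 for _ in range(10_000)]

# An observer who sees only the survivors infers a 100% success rate.
survivors = [b for b in bets if b]
naive_success_rate = sum(survivors) / len(survivors)   # exactly 1.0

# The full sample recovers the base rate and a sober expected payoff.
true_success_rate = sum(bets) / len(bets)              # close to 0.05
expected_multiple = true_success_rate * 20             # roughly 1x: break-even

print(f"Highlight-reel rate: {naive_success_rate:.0%}, "
      f"base rate: {true_success_rate:.1%}, "
      f"expected multiple: {expected_multiple:.2f}x")
```

The highlight reel reports a 100% success rate; the full sample shows that even a 20x payoff barely breaks even at this base rate. That gap is survivorship bias in one line of arithmetic.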
Information Asymmetry
Information asymmetry occurs when one party in a transaction has materially more information than the other. In quantum computing, the asymmetry is stark:
- Physicists understand the fundamental barriers (decoherence, error rates, the 1,000:1 overhead) but often have career incentives to be optimistic
- Engineers understand the practical challenges (cryogenic scaling, wiring problems, calibration overhead) but are employed by companies with incentives to project progress
- Investors and policymakers typically lack the technical background to independently evaluate claims and must rely on the same experts who have conflicted incentives
This creates the classic "lemons problem" identified by George Akerlof: when sellers (proponents) know more about the product (quantum computing) than buyers (investors), the market cannot price the asset correctly. Investors cannot distinguish between genuine technical progress and misleading framing, so they rely on social signals (who else is investing, who endorsed the company) rather than technical assessment.
| Information Holder | What They Know | What They Don't Say | Why |
|---|---|---|---|
| QC researchers | Exact error rates, coherence limits, scaling barriers | "Our approach may be fundamentally limited" | Career risk, grant funding |
| QC company executives | True progress vs. marketing claims | "We're further from commercialization than our roadmap suggests" | Stock price, investor relations |
| Consulting firms | Their projections are assumption-driven | "Our $450B figure assumes breakthroughs that may never occur" | Client demand for optimism |
| Conference organizers | Skeptics are excluded or marginalized | "We curate panels to favor optimistic perspectives" | Attendee and sponsor expectations |
Part V: How Biases Compound
Bias Compounding: The Multiplication Problem
Individual cognitive biases are manageable — a trained thinker can recognize and compensate for any single bias. The danger in quantum computing investment is that multiple biases operate simultaneously and reinforce each other, creating a compound effect that is far more resistant to correction than any individual bias.
Consider a typical quantum computing investment decision. The investor encounters the following biases in sequence:
- Anchoring: They see McKinsey's $450B market projection (anchor set high)
- Authority bias: The projection comes from McKinsey — a prestigious firm (authority validates the anchor)
- Bandwagon effect: Google, IBM, and Microsoft are all investing (social proof reinforces the anchor)
- Confirmation bias: They read three positive articles for every skeptical one (confirming information dominates)
- Narrative fallacy: "Quantum computing is the next internet" — a compelling story that bypasses analysis
- FOMO: "If I don't invest now, I'll miss the opportunity" (urgency overrides deliberation)
- Optimism bias: "Sure, there are challenges, but they'll be solved" (future feels brighter than evidence warrants)
- Dunning-Kruger: "I understand enough about quantum mechanics to evaluate this" (overconfidence in limited knowledge)
Each bias alone might be overcome by careful thinking. But when eight biases align in the same direction, the cumulative force is nearly irresistible. The investor makes what feels like a well-reasoned decision — they consulted experts, read the research, evaluated the market — but every step of their reasoning was distorted by the same set of compounding biases.
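The additive drift described above can be sketched numerically. The percentage-point magnitudes below are illustrative assumptions about each bias's pull, not measured effect sizes.

```python
# Illustrative magnitudes only (percentage-point shifts, not measured values).
rational_estimate = 0.05  # evidence-based probability of commercial success
bias_effects = {          # how far each aligned bias drags the estimate upward
    "anchoring": 0.15, "authority bias": 0.08, "bandwagon": 0.10,
    "confirmation bias": 0.07, "narrative fallacy": 0.08, "FOMO": 0.10,
    "optimism bias": 0.08, "Dunning-Kruger": 0.07,
}
biased_estimate = rational_estimate + sum(bias_effects.values())
gap = biased_estimate - rational_estimate

print(f"Rational: {rational_estimate:.0%}; "
      f"bias-influenced: {biased_estimate:.0%}; gap: {gap:.0%}")
```

With these placeholder magnitudes, eight aligned biases turn a 5% evidence-based estimate into a 78% felt certainty. No single bias does the damage; the sum does.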
Diagram: Bias Compounding Cascade Simulator
Cognitive Bias Compounding Cascade Simulator
Type: microsim
sim-id: bias-compounding-cascade
Library: p5.js
Status: Specified
Learning Objective: Demonstrate how multiple cognitive biases compound to distort a technology investment decision, showing the gap between rational assessment and bias-influenced assessment widens with each additional bias (Bloom's Level 3: Apply — demonstrate, use, execute).
Instructional Rationale: A step-through simulation with a visual "bias meter" is appropriate because the Apply/demonstrate objective requires learners to see the cumulative effect of each bias being added. The additive visual representation makes the compounding effect tangible and memorable.
Canvas Layout:
- Top: Horizontal "rationality meter" showing estimated probability of QC success (0-100%)
- Middle: Sequential bias cards (one per bias) that flip over one at a time
- Bottom: Running comparison — "Rational estimate" vs "Bias-influenced estimate"
Interactive Controls:
- Button: "Add Next Bias" — flips the next bias card and adjusts the meter
- Button: "Remove Last Bias" — removes the most recently added bias
- Button: "Reset" — returns to starting state
- Slider: "Starting rational estimate" (1% to 50%, default 5%) — sets the base rate before biases
- Toggle: "Show bias strength" — displays the magnitude of each bias's effect
Visual Elements:
- Horizontal meter with two markers: blue (rational estimate, stays fixed) and red (bias-influenced, drifts upward)
- Bias cards arranged in a horizontal row, face-down initially
- Each card flips to reveal: bias name, icon, description, and effect magnitude
- Running gap display: "Bias gap: X percentage points"
- Background: aliceblue
Data Visibility Requirements:
- Stage 0 (no biases): Show starting rational estimate (e.g., 5%)
- Stage 1 (anchoring): Show how the McKinsey anchor shifts estimate upward (+15 pp → 20%)
- Stage 2 (authority bias): Show McKinsey's authority adding credibility (+8 pp → 28%)
- Stage 3 (bandwagon): Show "Google, IBM investing" adding pressure (+10 pp → 38%)
- Stage 4 (confirmation bias): Show selective evidence filtering (+7 pp → 45%)
- Stage 5 (narrative fallacy): Show "next internet" story adding appeal (+8 pp → 53%)
- Stage 6 (FOMO): Show urgency override (+10 pp → 63%)
- Stage 7 (optimism bias): Show "challenges will be solved" (+8 pp → 71%)
- Stage 8 (Dunning-Kruger): Show false confidence in assessment (+7 pp → 78%)
- Final display: "Rational estimate: 5%. Bias-influenced estimate: 78%. Bias gap: 73 percentage points."
Behavior:
- Each bias adds to the red marker's position (with slight randomization for replayability)
- The blue marker (rational estimate) never moves — it's the anchor for comparison
- The gap between markers grows visually as biases accumulate
- At the end, a summary panel shows: "An investor influenced by these 8 biases estimates a 78% chance of success. The evidence-based estimate is 5%."
Responsive Design: Cards wrap to new row on narrow screens; meter stays full-width.
Implementation: p5.js with DOM card elements and canvas meter
Bias in Technology Forecasting
The compounding of biases has a specific and measurable effect on technology forecasting: it makes forecasts systematically overoptimistic. The evidence for this is overwhelming:
- Philip Tetlock's research on expert political judgment found that experts' predictions were barely better than chance — and that the most confident experts were the least accurate
- The track record of quantum computing predictions (reviewed in Chapter 10) shows 0% accuracy on timeline estimates
- Consulting firm projections for quantum computing market size are based on assumptions that encode every bias we've discussed: anchored to theoretical potential, confirmed by selective evidence, narrated as inevitable, and socially validated by prestigious endorsement
The corrective for biased forecasting is a discipline called "reference class forecasting," which proceeds in four steps:
- Identify the reference class: What category of technologies is this most similar to? (For QC: technologies requiring multiple simultaneous physics breakthroughs)
- Determine the base rate: What fraction of technologies in this class succeeded? (Answer: 5-15%)
- Adjust from the base rate: Is there specific evidence that this case is better or worse than the base rate? (For QC: the evidence is mixed, but the compounding of required breakthroughs suggests worse than base rate)
- Resist narrative adjustment: Do not allow stories, analogies, or social pressure to override the statistical evidence
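The four steps above can be sketched as a small function. This is a pedagogical sketch, not a standard algorithm: the function name, the bounded-adjustment rule, and the specific bound are my own illustrative assumptions about how step 4 ("resist narrative adjustment") might be enforced mechanically.

```javascript
// Reference class forecasting, sketched. All names are illustrative.
// Steps 1-2: the reference class and its base rate are inputs.
// Step 3: adjust only on specific evidence, within a modest bound.
// Step 4: narrative and social inputs are deliberately not parameters,
//         and the clamp keeps stories from swamping the statistics.
function referenceClassForecast(baseRatePct, evidenceAdjustmentPp, maxAdjustmentPp = 5) {
  const bounded = Math.max(-maxAdjustmentPp, Math.min(maxAdjustmentPp, evidenceAdjustmentPp));
  return Math.max(0, Math.min(100, baseRatePct + bounded));
}

// QC example from the text: base rate ~5-15% for technologies requiring
// multiple simultaneous breakthroughs; mixed-to-negative specific evidence.
const forecast = referenceClassForecast(10, -3);
console.log(`Forecast: ${forecast}%`); // 7%
```

The point of the clamp is structural: even a wildly enthusiastic "adjustment" of +40 pp cannot push the forecast more than a few points above the base rate, which is exactly the discipline the fourth step demands.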
Bias Alert
The most dangerous bias in technology forecasting isn't any single bias — it's the meta-bias of believing you're unbiased. Every investor, researcher, and policymaker believes they are thinking clearly. The quantum computing industry is not sustained by stupid people making obvious mistakes. It is sustained by smart people making subtle, psychologically predictable errors — the same errors that inflated the dot-com bubble, the housing bubble, and every other speculative mania. Recognizing that you are not immune is the first step toward genuine rationality.
Diagram: Bias Detection Checklist Tool
Cognitive Bias Detection Checklist Tool
Type: microsim
sim-id: bias-detection-checklist
Library: p5.js
Status: Specified
Learning Objective: Apply a structured bias detection checklist to analyze a quantum computing press release or investor pitch, identifying embedded biases and rating the overall bias level (Bloom's Level 3: Apply — use, execute, implement).
Instructional Rationale: An interactive checklist with built-in examples is appropriate because the Apply/use objective requires learners to practice the skill of bias detection on realistic material. Pre-loaded examples provide immediate practice opportunities.
Canvas Layout:
- Left panel (55%): Sample text display area (press release or pitch excerpt)
- Right panel (45%): Bias checklist with checkboxes and scoring

Interactive Controls:
- Dropdown: Select sample text ("IonQ press release," "McKinsey QC report excerpt," "Government funding announcement," "Skeptical analysis" — four pre-loaded examples)
- 12 checkboxes (one per bias covered in this chapter, excluding overview and compounding)
- Each checkbox has a brief prompt: e.g., "Sunk cost: Does it reference past investment as justification?"
- "Calculate Bias Score" button
- "Show expert analysis" toggle — reveals which biases an expert identified in the same text

Visual Elements:
- Sample text with biased phrases highlighted in yellow after analysis
- Bias score gauge: 0 (minimal bias) to 12 (maximum bias)
- Color-coded result: green (0-3), yellow (4-6), orange (7-9), red (10-12)
- Comparison panel: user's identified biases vs expert analysis
- Background: aliceblue

Behavior:
- User reads sample text and checks biases they identify
- On "Calculate," the tool scores and reveals the expert analysis
- Highlighted phrases in the text link to specific biases
- Score message: "You identified X of Y biases. Expert identified Y. [Feedback message]"
- Feedback adapts: "Excellent!" (within 2 of expert) / "Look more carefully at..." (missed 3+)
Responsive Design: Panels stack vertically on narrow screens; text area scrolls.
Implementation: p5.js with DOM elements for text display and checkboxes
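The checklist's scoring and feedback rules can be sketched as follows. The thresholds (color bands, "within 2 of expert") come from the spec above; the function names, the example bias lists, and the exact feedback strings are illustrative assumptions.

```javascript
// Map a 0-12 bias score to the spec's color bands.
function biasBand(score) {
  if (score <= 3) return "green";
  if (score <= 6) return "yellow";
  if (score <= 9) return "orange";
  return "red";
}

// Feedback adapts to how close the user came to the expert's count:
// missed 2 or fewer -> praise; missed 3 or more -> prompt to look again.
function feedback(userBiases, expertBiases) {
  const found = userBiases.filter((b) => expertBiases.includes(b)).length;
  const missed = expertBiases.length - found;
  const message = missed <= 2 ? "Excellent!" : "Look more carefully at the highlighted phrases.";
  return { found, expert: expertBiases.length, message };
}

// Hypothetical expert analysis of one sample text.
const expert = ["anchoring", "authority bias", "FOMO", "narrative fallacy"];
const result = feedback(["anchoring", "FOMO"], expert);
console.log(biasBand(expert.length), "-", result.message);
```

Scoring only the intersection with the expert's list (rather than the raw checkbox count) keeps the tool from rewarding indiscriminate box-checking.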
Key Takeaways
The cognitive biases examined in this chapter are not incidental to quantum computing investment — they are the mechanism by which investment persists despite negative evidence. Without these biases, the rational assessment frameworks from Chapters 8-10 would lead to dramatically reduced investment:
- Sunk cost fallacy and motivated reasoning keep capital flowing after negative signals by making past spending feel irrecoverable and desired conclusions feel rational
- Confirmation bias, anchoring, and narrative fallacy distort how evidence is processed, ensuring that positive signals are amplified and negative signals are filtered out
- Bandwagon effect, authority bias, FOMO, and groupthink create social pressure that makes dissent costly and conformity rewarding
- Dunning-Kruger, optimism bias, and survivorship bias create unwarranted confidence by inflating the perception of understanding, overweighting positive outcomes, and hiding failures
- Information asymmetry is the structural condition that makes all other biases more powerful — when investors cannot independently verify claims, they must rely on biased sources
- Bias compounding is the key insight: individual biases are manageable, but when 8-12 biases reinforce each other simultaneously, the cumulative distortion transforms a 5% probability into a 70%+ perceived probability
The antidote is not intelligence — it is process. Structured decision-making, reference class forecasting, mandatory devil's advocacy, and the explicit identification of cognitive biases before making investment decisions can significantly reduce (though never eliminate) the influence of biased thinking.
Excellent Investigative Work!
You can now identify the psychological machinery that sustains quantum computing hype — and any other technology hype you'll encounter. Sunk cost fallacy, confirmation bias, FOMO, groupthink: these aren't abstract concepts anymore. They're tools in your investigative kit. The next time someone tells you quantum computing is "inevitable," you'll know exactly which biases are doing the talking. Outstanding work, fellow investigator!
Review Questions
Question 1: Why are intelligent, well-educated people still susceptible to cognitive biases?
Cognitive biases are not errors of intelligence — they are systematic patterns built into the architecture of human cognition. They arise from heuristics (mental shortcuts) that the brain evolved for efficiency, not accuracy. A brilliant physicist can fall prey to sunk cost fallacy because the bias operates below conscious reasoning. A seasoned investor can succumb to anchoring because the bias distorts the starting point for all subsequent analysis. Education and intelligence can help people recognize biases when prompted, but biases operate by default unless actively counteracted through structured decision processes.
Question 2: How does information asymmetry amplify other cognitive biases in quantum computing?
Information asymmetry means investors and policymakers cannot independently verify the technical claims made by quantum computing proponents. This amplifies other biases in several ways: (1) it makes authority bias more powerful, because non-experts must defer to authorities who may have conflicted incentives; (2) it enables the Dunning-Kruger effect, because superficial understanding feels like sufficient understanding when verification is impossible; (3) it strengthens confirmation bias, because investors cannot tell which positive signals are meaningful and which are marketing; and (4) it protects narratives from scrutiny, because the narrative fallacy is hardest to detect when the audience lacks the technical knowledge to challenge the story.
Question 3: Give a specific example of how anchoring bias operates in quantum computing investment.
When McKinsey projects a $450-850 billion quantum computing market by 2035, this number becomes an anchor that influences all subsequent assessment. An investor who encounters this projection will unconsciously use it as a starting point, even if they later encounter skeptical analysis. Their "adjusted" estimate might be "$100 billion — even if McKinsey is off by 80%." But the anchor itself is based on assumptions about physics breakthroughs that haven't occurred, meaning the correct starting anchor is closer to $0 in commercial revenue. The investor's "skeptical" $100 billion estimate is actually far more optimistic than the evidence supports, but it feels conservative because they adjusted downward from the $450 billion anchor. This is anchoring in action: the initial number dominates even when consciously adjusted.
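The insufficient-adjustment arithmetic in this answer can be made explicit. The values below come from the example; the variable names and the sketch itself are illustrative.

```javascript
// Anchoring-and-adjustment, sketched with the example's numbers.
const anchorBillions = 450;        // McKinsey's low-end projection
const skepticalDiscount = 0.8;     // "even if McKinsey is off by 80%"
const adjusted = anchorBillions * (1 - skepticalDiscount); // ~$90B, felt as ~$100B

const evidenceBasedBillions = 0;   // no meaningful commercial revenue to date
console.log(`"Skeptical" estimate: ~$${Math.round(adjusted)}B; evidence-based: $${evidenceBasedBillions}B`);
// Even an 80% downward adjustment leaves the estimate anchored
// tens of billions above what the evidence supports.
```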
Question 4: What is groupthink and how does it manifest in the quantum computing research community?
Groupthink is the tendency for cohesive groups to prioritize consensus over critical evaluation, suppressing dissent and discouraging exploration of alternatives. In the quantum computing community, groupthink manifests as: (1) an illusion of invulnerability ("quantum computing is inevitable"); (2) collective rationalization of slow progress ("we just need more time and funding"); (3) stereotyping of skeptics ("they don't understand quantum mechanics"); (4) self-censorship by researchers who privately doubt the timeline; (5) direct pressure on dissenters through conference exclusion, grant denial, or colleague criticism; and (6) the illusion of unanimity created by this suppression of dissent. The small size of the quantum computing community (researchers attend the same conferences and compete for the same grants) makes groupthink particularly effective.
Question 5: Explain the bias compounding effect using a step-by-step example.
Consider an investor evaluating a quantum computing startup. They begin with no strong prior. Step 1 (Anchoring): They see a consulting report projecting $450B market — their mental starting point shifts high. Step 2 (Authority bias): The report is from McKinsey — they trust it more. Step 3 (Bandwagon): They learn Google and IBM are investing heavily — social proof validates the high anchor. Step 4 (Confirmation bias): They read four positive articles and one skeptical one — the positive ones feel more credible. Step 5 (Narrative fallacy): "Quantum is the next internet" tells a compelling story that ties the pieces together. Step 6 (FOMO): They worry about missing the opportunity if they wait. Step 7 (Optimism bias): They assume the technical barriers will be overcome. Step 8 (Dunning-Kruger): Having read several articles, they feel competent to evaluate the investment. Each bias individually shifts their assessment by perhaps 5-15 percentage points. Compounded, an evidence-based 5% probability of success feels like 70-80%. The investor commits capital to what feels like a well-researched decision, but every step was distorted.