
What Is a General Purpose Technology?

Summary

This chapter introduces the concept of General Purpose Technologies (GPTs) — technologies that transform entire economies by being broadly applicable, improvable over time, and enabling complementary innovations. We examine historical GPTs including the steam engine, electricity, the transistor, and the internet, and show how AI/ML is emerging as the next GPT. We then systematically demonstrate why quantum computing fails every test for being a General Purpose Technology: it is narrowly applicable, cannot replace classical computers, and lacks an ecosystem of complementary innovations.

Concepts Covered

This chapter covers the following 13 concepts from the learning graph:

  1. General Purpose Technology
  2. GPT Characteristics
  3. Must Be Broadly Applicable
  4. Must Improve Over Time
  5. Must Enable New Innovations
  6. Steam Engine as GPT
  7. Electricity as GPT
  8. Transistor as GPT
  9. Internet as GPT
  10. AI/ML as Emerging GPT
  11. QC Fails Every GPT Test
  12. QC Is Narrowly Applicable
  13. QC Cannot Replace Classical

Prerequisites

This chapter builds on concepts from:


Fermi Welcomes You!

Welcome, fellow investigators! This chapter shifts from physics to economics. Quantum computing proponents often imply that it will be a transformative technology on a par with electricity or the internet. To evaluate that claim, we need a rigorous framework for what makes a technology truly transformative. Economists have one: it is called General Purpose Technology theory. But does the math check out? Let's find out!

Learning Objectives

After completing this chapter, you will be able to:

  • Define a General Purpose Technology and state its three essential characteristics
  • Identify historical GPTs and explain why each qualifies using the three-criteria framework
  • Evaluate whether AI/ML meets the criteria for an emerging GPT
  • Demonstrate that quantum computing fails all three GPT criteria with specific evidence
  • Distinguish between technologies that are scientifically impressive and technologies that are economically transformative
  • Apply the GPT framework to assess emerging technology investment narratives

What Is a General Purpose Technology?

The concept of General Purpose Technologies was formalized by economists Timothy Bresnahan and Manuel Trajtenberg in their 1995 paper "General Purpose Technologies: Engines of Growth?" The theory provides a rigorous framework for identifying the handful of technologies in human history that have fundamentally restructured economies and societies.

A GPT is not merely an important technology. Many technologies are important without being general-purpose. Nuclear power is important. MRI machines are important. Satellite navigation is important. But none of these individually transformed the structure of the economy the way that a true GPT does.

The GPT framework matters for our analysis because quantum computing proponents — implicitly or explicitly — claim that quantum computing will be transformative on the scale of the transistor or the internet. If that claim is false, then the investment thesis collapses: a narrow, specialized technology does not justify the $100+ billion in investment that quantum computing has attracted.


The Three GPT Characteristics

Bresnahan and Trajtenberg identified three characteristics that a technology must exhibit to qualify as a GPT. All three are necessary — a technology that satisfies only one or two is not a GPT.

1. Must Be Broadly Applicable

A GPT must be usable across a wide range of industries, sectors, and applications — not confined to a single domain. It must be a platform technology that many different users can build upon for many different purposes.

What "broadly applicable" means:

  • The technology is used in manufacturing, services, agriculture, transportation, communication, entertainment, healthcare, finance, and government
  • New users discover applications that the inventors never anticipated
  • The technology becomes part of the basic infrastructure of the economy

What "broadly applicable" does NOT mean:

  • A technology used by many companies in the same narrow domain (e.g., many pharmaceutical companies using the same type of spectrometer)
  • A technology available to many people but useful for only one purpose

2. Must Improve Over Time

A GPT must exhibit sustained improvement in performance, cost, or capability over extended periods — decades, not years. This sustained improvement is what makes the technology increasingly valuable and drives adoption across new sectors.

What "improve over time" means:

  • Performance increases or cost decreases by orders of magnitude over the technology's lifecycle
  • Improvements are driven by both incremental refinement and fundamental advances
  • The technology's trajectory of improvement is self-sustaining: improvements in one component drive improvements in others

3. Must Enable New Innovations

A GPT must create an ecosystem of complementary innovations — new products, services, industries, and even new technologies that are made possible by the GPT but were not part of the original invention. This characteristic is sometimes called "innovation spawning" and is the most important indicator of a true GPT.

What "enable new innovations" means:

  • Entirely new industries emerge that could not exist without the GPT
  • The GPT creates a platform on which entrepreneurs, researchers, and businesses build novel applications
  • The complementary innovations are more economically valuable, in aggregate, than the GPT itself

Fermi's Tip

When evaluating any technology investment thesis, test it against these three criteria. If the technology is narrowly applicable, has plateauing improvement, or does not spawn a complementary innovation ecosystem, it may be a valuable specialty tool — but it will not transform the economy, and investment predicated on economic transformation will not pay off.


Historical GPTs

Only a handful of technologies in human history have qualified as GPTs. Examining these successes builds the framework we need to evaluate quantum computing.

The Steam Engine as GPT

The steam engine, perfected by James Watt in the 1760s-1780s, is the canonical example of a GPT.

| GPT Criterion | How the Steam Engine Qualifies |
| --- | --- |
| Broadly applicable | Powered factories, mines, railroads, ships, water pumping, agriculture. Used in virtually every sector of the economy. |
| Improved over time | Watt's engine was 4x more efficient than Newcomen's. By 1900, steam turbines were 100x more efficient. Continuous improvement over 150+ years. |
| Enabled new innovations | Railroads (new industry), factory system (new production model), urbanization (new social structure), global shipping (new trade patterns). The complementary innovations reshaped civilization. |

Electricity as GPT

Electrification, beginning in the 1880s with Edison's power stations and Tesla/Westinghouse's AC systems, transformed every aspect of economic life.

| GPT Criterion | How Electricity Qualifies |
| --- | --- |
| Broadly applicable | Powers lighting, heating, cooling, manufacturing, transportation, communication, computation, and virtually every modern device. |
| Improved over time | Generation efficiency improved from ~5% (1880s) to ~60% (modern combined-cycle gas turbines). Distribution losses dropped steadily. Cost per kWh fell by orders of magnitude. |
| Enabled new innovations | Radio, television, telecommunications, computing, the internet (all require electricity). Refrigeration (transformed food supply chains). Electric motors (transformed manufacturing). The complementary innovation ecosystem is almost unbounded. |

The Transistor as GPT

The transistor, invented at Bell Labs in 1947, is the most economically consequential invention of the twentieth century.

| GPT Criterion | How the Transistor Qualifies |
| --- | --- |
| Broadly applicable | Used in every electronic device: computers, phones, cars, medical equipment, industrial control, consumer electronics, military systems. Estimated 10 sextillion (\(10^{22}\)) transistors manufactured as of 2025. |
| Improved over time | Moore's Law: transistor density doubled approximately every 2 years for 60+ years. Cost per transistor fell from ~\(\$1\) (1960s) to ~\(\$10^{-10}\) (2020s), a roughly ten-billion-fold improvement. |
| Enabled new innovations | The personal computer, the smartphone, the internet, cloud computing, artificial intelligence, GPS, digital photography, electronic commerce, social media. Each of these is a multi-trillion-dollar industry that could not exist without the transistor. |

The Internet as GPT

The internet, evolving from ARPANET (1969) through the World Wide Web (1991) to the modern cloud, is the most recent fully established GPT.

| GPT Criterion | How the Internet Qualifies |
| --- | --- |
| Broadly applicable | Used by individuals, businesses, governments, schools, hospitals, farms, factories. Over 5 billion users worldwide. Penetrates every sector of every economy. |
| Improved over time | Bandwidth increased from 300 baud (1960s) to multi-gigabit (2020s) — a million-fold improvement. Latency decreased from seconds to milliseconds. Cost per bit transmitted fell by orders of magnitude. |
| Enabled new innovations | E-commerce (Amazon, Alibaba), social media (Meta, TikTok), cloud computing (AWS, Azure), streaming entertainment (Netflix, Spotify), the gig economy (Uber, DoorDash), remote work, telemedicine, online education. Each represents an industry that did not exist before the internet. |

Diagram: GPT Qualification Scorecard

GPT Qualification Scorecard

Type: infographic sim-id: gpt-scorecard
Library: p5.js
Status: Specified

Bloom Level: Evaluate (L5) Bloom Verb: Assess, Compare

Learning Objective: Students will be able to assess multiple technologies against the three GPT criteria and compare their qualification profiles, understanding why some technologies are truly transformative while others are not.

Instructional Rationale: An interactive scorecard enables students to evaluate each technology against standardized criteria, building the analytical habit of systematic assessment rather than impressionistic judgment. The side-by-side comparison makes the contrast between genuine GPTs and quantum computing visually stark.

Canvas layout:

  • Full width: A matrix with technologies as rows and GPT criteria as columns

Visual elements:

  • A grid/matrix with 7 rows (Steam Engine, Electricity, Transistor, Internet, AI/ML, Quantum Computing, [blank for student entry]) and 3 columns (Broadly Applicable, Improves Over Time, Enables New Innovations)
  • Each cell contains a score (0-10) represented as a filled bar
  • Color coding: Green (8-10), Yellow (4-7), Red (0-3)
  • Row totals on the right side
  • A "GPT Threshold" line at a combined score of 24/30 (approximate)

Data:

| Technology | Broadly Applicable | Improves Over Time | Enables Innovations | Total |
| --- | --- | --- | --- | --- |
| Steam Engine | 9 | 8 | 10 | 27 |
| Electricity | 10 | 9 | 10 | 29 |
| Transistor | 10 | 10 | 10 | 30 |
| Internet | 10 | 9 | 10 | 29 |
| AI/ML | 9 | 9 | 8 | 26 |
| Quantum Computing | 2 | 3 | 1 | 6 |

Interactive features:

  • Hover over any cell to see: detailed justification for the score, specific examples, and counterarguments considered
  • Click on the blank row to open an input mode where students can evaluate a technology of their choice
  • Toggle: "Show comparison overlay" — overlays the QC row on each historical GPT row to visualize the gap
  • Toggle: "Show evidence" — expands each cell to show bullet-point evidence for the score

Implementation: p5.js. Background: aliceblue. Responsive to window resize.
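The scorecard's row totals and threshold check reduce to a few lines. Here is a minimal Python sketch using the illustrative scores from the data table above (the scores are the chapter's qualitative judgments, and the 24/30 threshold is approximate, not an established economic standard):

```python
# Illustrative GPT scorecard. Scores are the chapter's qualitative
# judgments on a 0-10 scale: (broadly applicable, improves, enables).
SCORES = {
    "Steam Engine":      (9, 8, 10),
    "Electricity":       (10, 9, 10),
    "Transistor":        (10, 10, 10),
    "Internet":          (10, 9, 10),
    "AI/ML":             (9, 9, 8),
    "Quantum Computing": (2, 3, 1),
}
GPT_THRESHOLD = 24  # approximate combined-score threshold out of 30

totals = {tech: sum(scores) for tech, scores in SCORES.items()}
for tech, total in totals.items():
    verdict = "GPT" if total >= GPT_THRESHOLD else "not a GPT"
    print(f"{tech:18s} {total:2d}/30 -> {verdict}")
```

Quantum computing's 6/30 is not a near miss against the 24/30 line; on these scores it sits in a different category from every historical GPT.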


AI/ML as an Emerging GPT

Before evaluating quantum computing, it is instructive to examine a technology that is emerging as a GPT: artificial intelligence and machine learning. The contrast with quantum computing is revealing.

AI/ML Meets the GPT Criteria

| GPT Criterion | How AI/ML Qualifies | Current Evidence |
| --- | --- | --- |
| Broadly applicable | Used in healthcare (diagnostics, drug discovery), finance (fraud detection, trading), transportation (autonomous vehicles), manufacturing (quality control, predictive maintenance), agriculture (crop monitoring), entertainment (recommendation systems), education (tutoring), and virtually every other sector | Deployed at scale in thousands of companies across every major industry |
| Improves over time | Model capabilities have improved exponentially: GPT-2 (2019, 1.5B parameters) → GPT-4 (2023, ~1.8T parameters). Training cost per unit of capability drops rapidly. New architectures (transformers, diffusion models) emerge regularly. | Performance on benchmarks has improved by 10-100x in 5 years; the improvement trajectory shows no sign of plateauing |
| Enables new innovations | AI-generated content (text, images, video, code), autonomous systems, personalized medicine, real-time language translation, scientific discovery assistance, new forms of human-computer interaction. Many of these applications could not exist without modern ML. | Multi-billion-dollar industries are forming around AI applications; AI is becoming embedded infrastructure for other innovations |

The key observation is that AI/ML generates revenue today. Companies build products that customers pay for. The technology creates measurable economic value across diverse sectors. This is not speculative — it is demonstrated in quarterly earnings reports, product launches, and enterprise adoption metrics.

Key Insight

Compare the trajectories. AI/ML in 2024 has millions of paying customers, generates hundreds of billions in revenue, and is deployed in every major industry. Quantum computing in 2025 has zero paying customers for quantum advantage, generates near-zero product revenue, and has not solved a single commercial problem. Both have received enormous investment. Only one has produced returns. This is the difference between a technology that meets GPT criteria and one that does not.


QC Fails Every GPT Test

We can now apply the GPT framework systematically to quantum computing. The result is unambiguous: quantum computing fails all three criteria.

Criterion 1: Broadly Applicable — FAIL

A GPT must be usable across a wide range of industries and applications. Quantum computing is not.

The narrow problem set:

As established in Chapter 2, the number of problems with known quantum speedups is remarkably small:

  • Integer factoring and discrete logarithms (Shor's algorithm)
  • Unstructured search (Grover's algorithm — quadratic speedup only)
  • Quantum system simulation
  • Certain linear algebra problems (HHL — with severe caveats)
  • A handful of specialized optimization variants (unproven advantage)

This represents a vanishingly small fraction of all computational tasks. The vast majority of computing — web serving, databases, machine learning, video rendering, word processing, scientific simulation of classical systems, financial transactions, logistics, manufacturing control — receives zero benefit from quantum computing.
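Even within the short list above, the advantages vary enormously in kind. Grover's speedup, for instance, is only quadratic. This sketch puts that in perspective for a hypothetical unstructured search (the search-space size N is illustrative, not drawn from any benchmark):

```python
import math

# Illustrative scale of Grover's quadratic speedup versus brute force.
# N is a hypothetical unstructured search-space size.
N = 10**12
classical_queries = N              # worst case: check every item
grover_queries = math.isqrt(N)     # ~sqrt(N) quantum queries

print(f"classical: ~{classical_queries:,} queries")
print(f"Grover:    ~{grover_queries:,} queries")
```

A million-fold query reduction sounds large, but each quantum query is vastly slower and noisier than a classical memory access, and the quadratic gain is dwarfed by the exponential gains (as in Shor's algorithm) that the investment narratives usually invoke.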

Comparison with genuine GPTs:

| Technology | Fraction of Economy Affected | Example Non-Obvious Applications |
| --- | --- | --- |
| Electricity | >95% of all economic activity | Refrigeration, radio, elevators |
| Transistor | >90% of all economic activity | Credit cards, traffic lights, hearing aids |
| Internet | >80% of all economic activity | Ride-sharing, online dating, telemedicine |
| Quantum Computing | ~1% of all computation (theoretical best case) | None demonstrated |

Criterion 2: Improves Over Time — QUESTIONABLE

A GPT must exhibit sustained improvement over decades. Quantum computing's improvement trajectory is, at best, uncertain.

Qubit count has increased, roughly doubling every 1-2 years. But qubit count alone is not a meaningful metric — what matters is useful computational capacity, which is qubit count × gate fidelity × connectivity × coherence time. On this composite metric, improvement has been modest:

  • 2019: ~53 noisy qubits, error rate \(\sim 10^{-3}\), nearest-neighbor connectivity
  • 2025: ~1,200 noisy qubits, error rate \(\sim 10^{-3}\), nearest-neighbor connectivity

The qubit count increased 20x. The error rate improved only marginally. The useful computational capacity — the ability to run deeper circuits and solve harder problems — has not improved by anything close to what a GPT trajectory requires.
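The arithmetic behind "has not improved by anything close": with an unchanged gate error rate, adding qubits does not extend the circuits you can actually run. A toy Python sketch (the ~\(10^{-3}\) error rate and qubit counts are the chapter's rough figures; modeling a circuit as using every qubit at a depth equal to the qubit count, i.e. roughly qubits² gates, is a simplifying assumption):

```python
# Probability a circuit finishes with zero gate errors: (1 - p)^gates.
# Simplifying assumption: a circuit that uses every qubit at a depth
# equal to the qubit count, so roughly qubits^2 gates.
def success_probability(qubits, error_rate=1e-3):
    gates = qubits * qubits
    return (1 - error_rate) ** gates

print(f"2019, 53 qubits:   {success_probability(53):.1%}")    # ~6% chance of a clean run
print(f"2025, 1200 qubits: {success_probability(1200):.2e}")  # underflows to zero
```

More qubits at the same error rate make a full-width, full-depth circuit *less* likely to succeed, which is why qubit count alone overstates progress.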

Comparison with genuine GPT improvement trajectories:

| Technology | Improvement Factor Over 30 Years | Metric |
| --- | --- | --- |
| Transistor | ~1,000,000x | Transistors per chip |
| Internet | ~1,000,000x | Bandwidth per user |
| AI/ML | ~100,000x | Model parameters / capability |
| Quantum Computing | ~20x | Qubit count (but ~1x in useful capacity) |
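Another way to read the table is to convert each total factor into an implied doubling time — a rough back-of-the-envelope using the table's order-of-magnitude figures:

```python
import math

# Implied doubling time from a total improvement factor over ~30 years.
def doubling_time_years(factor, years=30):
    return years * math.log(2) / math.log(factor)

for name, factor in [("Transistor", 1_000_000),
                     ("Internet", 1_000_000),
                     ("AI/ML", 100_000),
                     # Averaged over 30 years; the recent qubit-count
                     # doubling has been faster, but from a tiny base.
                     ("Quantum computing (qubit count)", 20)]:
    print(f"{name}: doubles every ~{doubling_time_years(factor):.1f} years")
```

A million-fold gain over 30 years works out to doubling every ~1.5 years, which is exactly the Moore's Law cadence; the quantum figures do not come close even on the flattering qubit-count metric.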

Criterion 3: Enables New Innovations — FAIL

A GPT must spawn an ecosystem of complementary innovations — new products, industries, and technologies that could not exist without it. Quantum computing has spawned essentially nothing.

What complementary innovations has quantum computing enabled?

  • No new industries have been created by quantum computing
  • No new products depend on quantum computing for their functionality
  • No consumer, enterprise, or government application is built on quantum computing
  • The "quantum software" ecosystem (Qiskit, Cirq, PennyLane) exists to program quantum hardware that does not yet solve useful problems — it is tooling for a capability that does not exist, not a complementary innovation

Comparison with genuine GPT innovation ecosystems:

| GPT | Major Complementary Innovations |
| --- | --- |
| Steam Engine | Railroads, factory system, steamships, coal mining industry |
| Electricity | Radio, television, refrigeration, electric motors, computing |
| Transistor | Personal computers, smartphones, GPS, digital cameras, internet |
| Internet | E-commerce, social media, cloud computing, streaming, gig economy |
| AI/ML | AI-generated content, autonomous systems, personalized medicine, code assistants |
| Quantum Computing | None demonstrated |

Bias Alert

Proponents sometimes argue that quantum computing is "too early" to judge by GPT criteria — that we are in the equivalent of the 1880s for electricity or the 1950s for transistors. This analogy fails on the evidence. By the time the transistor was the same age as quantum computing is today (~30 years from first concept to assessment), it was already powering radios, telephone switches, and early computers, and the complementary innovation ecosystem was clearly emerging. Quantum computing, at a comparable stage, has not powered a single commercial application. The "it's too early" defense is unfalsifiable — and unfalsifiable claims should raise your skepticism, not lower it.


QC Is Narrowly Applicable

Let us quantify the narrowness of quantum computing's applicability by examining what fraction of real-world computational workloads could benefit from quantum speedups — even in the theoretical best case.

Categories of Computation

| Category | Fraction of Global Compute | QC Benefit? | Reason |
| --- | --- | --- | --- |
| Web serving & cloud infrastructure | ~30% | No | I/O-bound; no quantum algorithm helps |
| AI/ML training & inference | ~25% | No | GPU/TPU-optimized; no demonstrated quantum advantage |
| Databases & transaction processing | ~15% | No | Sequential consistency requirements; no quantum algorithm helps |
| Scientific simulation (classical physics) | ~10% | No | Simulating classical systems; no quantum advantage |
| Media processing (video, audio, images) | ~8% | No | Embarrassingly parallel; classical hardware optimal |
| Business applications (ERP, CRM, email) | ~7% | No | Not compute-bound; no quantum algorithm helps |
| Cryptanalysis | ~1% | Theoretically yes | Shor's algorithm — but requires hardware that doesn't exist |
| Quantum system simulation | ~0.1% | Theoretically yes | Feynman's original vision — but limited to quantum-specific problems |
| Certain optimization variants | ~0.1% | Possibly | Unproven; classical often matches or exceeds |
| Total with potential QC benefit | ~1.2% | — | Even this requires hardware that does not yet exist |

Even under the most generous assumptions, quantum computing could theoretically benefit approximately 1% of global computation. And that 1% requires hardware breakthroughs that may never materialize. A technology applicable to 1% of computation — even if it achieved dramatic speedups for that 1% — is not a General Purpose Technology. It is a specialty tool.
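The same point can be stated in Amdahl's-law form: the overall speedup from accelerating a fraction of the workload is capped by the untouched remainder. A minimal sketch, using the ~1.2% figure from the table above:

```python
# Amdahl's law: overall speedup when a fraction f of the work is
# accelerated by factor s is 1 / ((1 - f) + f / s).
def overall_speedup(f, s):
    accelerated = 0.0 if s == float("inf") else f / s
    return 1 / ((1 - f) + accelerated)

# Even an infinitely fast quantum co-processor applied to ~1.2% of
# global computation speeds up computation overall by barely 1.2%.
print(f"{overall_speedup(0.012, float('inf')):.4f}x overall")
```

An infinite speedup on 1.2% of the work yields at most a ~1.012x overall gain, which is why narrow applicability alone disqualifies a technology from GPT-scale economic impact.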

Diagram: Computational Workload Distribution

Global Computational Workload Distribution

Type: chart sim-id: workload-distribution
Library: Chart.js
Status: Specified

Bloom Level: Analyze (L4) Bloom Verb: Differentiate, Examine

Learning Objective: Students will be able to differentiate between computational workloads that could benefit from quantum computing and those that cannot, and examine why quantum computing's applicability is limited to a tiny fraction of global computation.

Instructional Rationale: A pie/donut chart with interactive drill-down is appropriate because the Analyze/differentiate objective requires students to see the proportional distribution of computational workloads and identify the tiny slice where quantum computing might apply. The visual impact of a sliver versus the whole pie is more persuasive than text alone.

Chart type: Donut chart with interactive segments

Data:

  • Web serving & cloud: 30% (gray)
  • AI/ML: 25% (blue)
  • Databases & transactions: 15% (green)
  • Scientific simulation (classical): 10% (teal)
  • Media processing: 8% (purple)
  • Business applications: 7% (orange)
  • Other classical: 3.8% (light gray)
  • Potential QC benefit (all categories): 1.2% (red — the key segment)

Visual elements:

  • Main donut chart showing the distribution
  • The 1.2% QC-applicable segment is pulled out and highlighted with a red glow
  • An annotation arrow pointing to the tiny segment: "Everything quantum computing could theoretically help with — and even this requires breakthroughs that may never happen"
  • A sub-donut expanding the 1.2% segment into its components (cryptanalysis, quantum simulation, optimization)

Interactive features:

  • Hover over any segment to see: category name, percentage, typical workloads, why QC helps or doesn't help
  • Click on the QC segment to expand the sub-donut showing the breakdown of potentially QC-applicable workloads
  • Toggle: "Show classical alternatives" — for each QC-applicable workload, show what classical method currently handles it and its performance

Implementation: Chart.js donut chart. Background: aliceblue. Responsive to window resize.


QC Cannot Replace Classical Computing

The final element of the GPT analysis is the most straightforward: quantum computers cannot replace classical computers for general-purpose computation. This is not a temporary limitation — it is a consequence of what quantum computers are.

Why Replacement Is Impossible

  1. Measurement destroys quantum states. Every computation must end with measurement, which collapses quantum information to classical bits. Quantum computers output classical data. They do not maintain persistent quantum state the way classical computers maintain persistent memory.

  2. The no-cloning theorem prevents copying. Classical computing depends fundamentally on the ability to copy data — for backup, for distribution, for display. Quantum states cannot be copied. This alone prevents quantum computers from performing most classical computing tasks.

  3. Error rates make general computation infeasible. Classical transistors operate with error rates of \(\sim 10^{-18}\). Quantum gates operate at \(\sim 10^{-3}\). For tasks that require high precision and long computation (which is most tasks), quantum computers are not merely slower — they produce incorrect results.

  4. I/O is fundamentally limited. Loading classical data into a quantum computer (state preparation) and extracting results (measurement) are bottlenecks. Most real-world tasks are data-intensive, making the I/O limitation disqualifying.

  5. Cost and infrastructure are prohibitive. Even if a quantum computer could theoretically perform a general-purpose task, the cost — cryogenics, specialized facilities, error correction overhead — would make it thousands to millions of times more expensive than a classical computer for the same task.
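The scale of the error-rate gap in point 3 is easy to miss in exponent notation. Here it is as expected error counts for a billion-gate computation (the \(10^{-18}\) and \(10^{-3}\) rates are the chapter's figures):

```python
# Expected number of errors = gates * per-gate error rate.
GATES = 1e9  # a billion operations, modest by classical standards

for name, p in [("classical transistor", 1e-18), ("quantum gate", 1e-3)]:
    expected_errors = GATES * p
    print(f"{name}: ~{expected_errors:.0e} expected errors per {GATES:.0e} gates")
```

A classical machine expects roughly one error per billion such computations; an uncorrected quantum device expects about a million errors within a single one. Closing that gap is precisely what error correction, with its enormous qubit overhead, is supposed to do.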

What Quantum Computers Are

Quantum computers, if they ever achieve fault tolerance at scale, would be co-processors — specialized accelerators called upon by classical computers for specific subproblems, analogous to how GPUs accelerate matrix operations or how FPGAs accelerate specific signal processing tasks.

| Role | Historical Analogy | Implication |
| --- | --- | --- |
| General-purpose computer | CPU, classical computer | Quantum computers cannot fill this role |
| Specialized co-processor | GPU, FPGA, DSP, TPU | Quantum computers could potentially fill this role — for a narrow set of problems |
| Infrastructure replacement | Cloud, data center | Quantum computers cannot replace classical infrastructure |

A co-processor is valuable — GPUs generate billions in revenue. But a co-processor for <1% of computation, requiring billion-dollar infrastructure that doesn't yet work, is not a General Purpose Technology. It is a speculative niche product.

Key Insight

The GPT framing reveals the core problem with the quantum computing investment thesis. The thesis assumes quantum computing will be transformative — that it will reshape industries and generate returns comparable to the transistor or the internet. But the GPT analysis shows that quantum computing is, at best, a specialized co-processor for a tiny fraction of computation. That may still be valuable, but the scale of investment ($100+ billion) is calibrated to a GPT-level return, not a co-processor-level return. The investment is mispriced relative to the technology's actual potential.


Key Takeaways

This chapter applied the General Purpose Technology framework to quantum computing and found a decisive mismatch between the investment thesis and the technology's characteristics:

  1. General Purpose Technologies are defined by three criteria: broad applicability, sustained improvement, and complementary innovation spawning. Only a handful of technologies in history — the steam engine, electricity, the transistor, the internet — have qualified.

  2. The three criteria are necessary, not optional. A technology that satisfies only one or two criteria is not a GPT. Nuclear power is powerful but not broadly applicable. The Concorde was impressive but did not improve economically over time. Neither was a GPT.

  3. AI/ML is emerging as a GPT — it is broadly applicable (deployed across every major industry), improving rapidly (exponential capability gains), and spawning complementary innovations (AI-generated content, autonomous systems, scientific discovery tools).

  4. Quantum computing fails all three GPT criteria:

    • Not broadly applicable: Theoretically applicable to <1% of global computation, limited to problems with specific mathematical structure.
    • Questionable improvement trajectory: Qubit counts are increasing, but useful computational capacity has improved only marginally. Error rates have barely budged.
    • No complementary innovations: Zero new industries, products, or applications depend on quantum computing.
  5. Quantum computing is, at best, a specialized co-processor — analogous to a GPU or FPGA, not to electricity or the internet. Co-processors can be valuable, but they do not justify GPT-level investment.

  6. The investment is mispriced. Over $100 billion has been invested on the implicit assumption that quantum computing will be transformative at GPT scale. The GPT analysis shows this assumption is unsupported. The potential returns are co-processor-scale, not GPT-scale.

Excellent Investigative Work!

You now have a rigorous economic framework — General Purpose Technology theory — to evaluate whether a technology will be truly transformative or merely interesting. You have applied it to quantum computing and found that it fails all three criteria. The next time someone compares quantum computing to the invention of the transistor or the internet, you can explain precisely why that analogy does not hold. That is the kind of analytical thinking that separates informed skepticism from uninformed hype. Outstanding work, fellow investigator!


Review Questions

1. State the three criteria for a General Purpose Technology and give an example of a technology that satisfies all three.

The three criteria are: (1) Broadly applicable — used across a wide range of industries and sectors. (2) Improves over time — exhibits sustained improvement in performance or cost over decades. (3) Enables complementary innovations — spawns new products, industries, and technologies that could not exist without it. The transistor satisfies all three: it is used in every electronic device across every industry (broad applicability), transistor density doubled every ~2 years for 60+ years while cost fell a trillion-fold (improvement), and it enabled personal computers, smartphones, the internet, GPS, digital photography, and AI (complementary innovations).

2. Why does quantum computing fail the 'broadly applicable' criterion? Be specific about the fraction of computation affected.

Quantum algorithms provide speedups only for problems with specific mathematical structure — integer factoring (Shor's), unstructured search (Grover's), quantum simulation, and certain linear algebra problems. This represents approximately 1% of global computational workloads. The remaining ~99% — web serving, databases, AI/ML, media processing, business applications, classical scientific simulation — receives zero benefit from quantum computing. A technology applicable to 1% of computation is a specialty tool, not a broadly applicable GPT.

3. Compare the improvement trajectories of the transistor and quantum computing over their first 30 years.

The transistor (1947-1977): transistor count per chip grew from 1 to roughly 29,000 (Intel 8086, 1978), a ~29,000x increase. Cost per transistor fell by roughly 10,000x. The technology was powering radios, computers, telephone switches, and calculators — generating billions in revenue. Quantum computing (1994-2024): qubit count grew from 2 to ~1,100, a ~550x increase. Error rates improved by roughly 10x (from \(10^{-2}\) to \(10^{-3}\)). The technology has not solved a single commercial problem. The transistor's improvement trajectory was ~50x faster, and it was already generating economic returns at a comparable stage.

4. What complementary innovations has quantum computing spawned? Compare this to the internet.

Quantum computing has spawned zero complementary innovations. No new industries, products, or applications depend on quantum computing for their functionality. The "quantum software" ecosystem (Qiskit, Cirq) is tooling for hardware that doesn't yet solve useful problems, not a complementary innovation. By contrast, the internet spawned e-commerce, social media, cloud computing, streaming entertainment, the gig economy, remote work, telemedicine, and online education — each representing multi-billion-dollar industries that could not exist without the internet.

5. A proponent argues that quantum computing is 'too early to judge' by GPT criteria — we are in the equivalent of 1885 for electricity. Why does this analogy fail?

By 1885 (only 3 years after Edison's first power station), electricity was already powering streetlights, small factories, and the first electric trolley systems. Within a decade, it powered elevators, enabling skyscrapers. The complementary innovation ecosystem was visibly emerging. Quantum computing, at a comparable stage (~30 years from first theoretical proposals), has not powered a single commercial application. Moreover, the "too early" argument is unfalsifiable — it can be deployed indefinitely, no matter how long the technology fails to produce results. An unfalsifiable defense of an investment thesis should increase skepticism, not decrease it.