Glossary of Terms
A comprehensive reference for the terminology used throughout this textbook. All definitions are precise, concise, and entirely serious.
Adaptive Learning
An educational approach in which content, pace, and difficulty adjust to individual student needs in real time, a concept that every teacher has practiced intuitively since Socrates and every edtech company has claimed to have invented since 2010. Adaptive learning systems use algorithms to personalize instruction, achieving approximately 12% of what a good teacher does through eye contact and the phrase "I can see you're confused."
Example: The adaptive learning platform detected that the student was struggling with fractions and responded by presenting fractions in a different color. The student continued to struggle with fractions, but now in blue.
AGI Timeline Claims
Predictions regarding when artificial general intelligence — AI that matches or exceeds human cognitive abilities across all domains — will be achieved. AGI timeline claims range from "within five years" to "never," and every point on this spectrum is asserted with equal confidence and supported by the same amount of evidence, which is none. The median AGI prediction has been "20 years from now" since 1956.
Example: In 1965, Herbert Simon predicted human-level AI within 20 years. In 2025, various experts predict human-level AI within 20 years. The consistency is remarkable.
AI Capabilities
The things an artificial intelligence system can actually do, as distinct from the things its press release claims it can do (see: AI Demo vs Product). Documenting AI capabilities requires specifying the conditions under which performance was measured, a practice observed by researchers and ignored by everyone else. Current AI capabilities are genuinely impressive. They are also genuinely not what the marketing says.
Example: Capability (real): "Classifies images of cats with 97.3% accuracy." Capability (press release): "Sees and understands the visual world like a human."
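The gap between "97.3% accuracy" and "understands the visual world" can be demonstrated in a few lines. The sketch below uses invented figures to show why an accuracy number means nothing without the conditions attached: on an imbalanced dataset, a model that never predicts "cat" at all still posts an impressive score.

```python
# Hypothetical illustration: all counts below are invented for the example.

def accuracy(predictions: list[str], labels: list[str]) -> float:
    """Fraction of predictions that match the labels. That is all it is."""
    correct = sum(p == l for p, l in zip(predictions, labels))
    return correct / len(labels)

# A dataset of 970 "not cat" images and 30 "cat" images.
labels = ["not cat"] * 970 + ["cat"] * 30

# A model that has never once recognized a cat.
lazy_model = ["not cat"] * 1000

print(f"{accuracy(lazy_model, labels):.1%}")  # 97.0%
```

The press release reports the 97.0%. The 30 cats are a footnote.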
AI Demo Magic
The set of techniques used to make an AI system appear more capable in a demonstration than it is in deployment, including but not limited to: cherry-picked inputs, controlled environments, edited outputs, and the strategic omission of the seventeen attempts that failed before the one shown on stage. AI demo magic is not fraud. It is "curated presentation." The distinction matters to lawyers.
Example: On stage: the AI generates a working app from a sketch in 30 seconds. In production: the AI generates an app that almost works from a detailed specification in 30 hours, after three rounds of debugging by the same engineer who could have written it faster.
AI Demo vs Product
The critical distinction between a carefully curated demonstration of an AI system's capabilities (which is compelling, polished, and performed under controlled conditions) and the actual product (which crashes when the user's name contains an apostrophe). Approximately 93.4% of AI investment decisions are made based on demos. Approximately 93.4% of AI disappointments occur when the product ships.
Example: The demo translated Shakespeare into Mandarin in real time. The product could not reliably distinguish "their," "there," and "they're."
AI Hallucination
The phenomenon in which an AI system generates confident, fluent, and entirely fabricated information, indistinguishable in tone from its accurate outputs. The term "hallucination" implies the AI is perceiving something unreal, when in fact it is doing exactly what it was trained to do: produce plausible text. The AI is not hallucinating. You are anthropomorphizing.
Example: An AI confidently cited a Supreme Court case that does not exist. A lawyer submitted it to a judge. The judge existed. The consequences existed. The case did not.
AI Hype Culture
The social and economic ecosystem that transforms incremental technical progress into breathless proclamations of civilizational transformation, sustained by a feedback loop of press releases, keynote speeches, Twitter threads, and venture capital that rewards volume of claims over accuracy of claims. AI hype culture has a GDP larger than several countries and produces approximately the same amount of useful output as a decorative fountain.
Example: A research paper demonstrating a 2.3% improvement in image classification accuracy becomes, through the AI hype pipeline: "AI Now Sees Better Than Humans — Is Your Job Next?"
AI in Education
The application of artificial intelligence to teaching and learning, a domain in which AI's potential is described as "transformative" by vendors, "concerning" by teachers, and "inevitable" by everyone who has noticed that students have been using ChatGPT since November 2022 regardless of institutional policy. AI in education is either the future of learning or the end of learning, depending on whether the speaker sells AI software or assigns five-paragraph essays.
Example: A school district banned AI tools, then formed a committee to study AI in education. The committee used AI to draft its report. Nobody mentioned this.
AI Limitations
The things an artificial intelligence system cannot do, which constitute a longer and more important list than the things it can do, but which occupy approximately 0.3% of the average press release. AI limitations include but are not limited to: understanding context, exercising judgment, knowing when it is wrong, caring about being wrong, and operating reliably outside the distribution of its training data. Further research is needed to determine when this list will shrink. It is currently growing.
Example: An AI can beat the world champion at Go. It cannot reliably determine whether a photograph shows a muffin or a chihuahua. Both statements are true simultaneously.
AI Literacy
The ability to understand what AI systems do, how they work, what they cannot do, and why the press release says otherwise. AI literacy is the most important skill that is not being taught in most schools, a distinction it shares with financial literacy, media literacy, and the ability to change a tire. Of these, AI literacy will become relevant soonest, because the AI is coming for the accountant, the journalist, and eventually the tire.
Example: An AI-literate person reads "Our AI understands your intent" and translates it to "Our AI performs pattern matching on your input." This translation is worth approximately $40,000 in avoided vendor contracts.
AI Training Data
The corpus of text, images, or other information used to train an AI model, sourced from the internet, books, and whatever else could be scraped before the lawsuits were filed. Training data determines what the model knows, what biases it inherits, and which copyright holders will eventually seek damages. The quality of AI output is bounded by the quality of its training data, which is to say, bounded by the internet.
Example: A model trained on the internet knows everything humanity has published, including the parts humanity wishes it hadn't.
AI Tutoring System
A software system that provides individualized instruction through conversational AI, positioned as a solution to the global teacher shortage by people who have never spent eight hours in a room with thirty 12-year-olds. AI tutoring systems are patient, consistent, available 24/7, and unable to determine whether a student is crying. These are presented as features.
Example: The AI tutor explained photosynthesis seventeen different ways. The student needed someone to notice they hadn't eaten breakfast.
AI Washing
The practice of describing a product, service, or company as "AI-powered" when the underlying technology is substantially less sophisticated than the label implies, ranging from "we use a simple regression model" to "an intern manually reviews the outputs" to "we have a roadmap that includes AI at some point." AI washing is the technological equivalent of labeling tap water as "artisanal hydration."
Example: An "AI-powered" hiring platform that uses keyword matching and a single if-else statement. The if-else statement is technically a decision tree. A decision tree is technically machine learning. The lawyers have confirmed this is defensible.
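A hypothetical reconstruction of such a platform's core engine is sketched below. Every name and keyword is invented for illustration; any resemblance to a funded startup is statistical.

```python
def ai_powered_candidate_screener(resume_text: str) -> str:
    """Proprietary neural talent-matching engine (per the press release)."""
    keywords = {"python", "synergy", "blockchain"}
    matches = sum(1 for word in keywords if word in resume_text.lower())
    if matches >= 2:           # a single if-else: technically a decision tree;
        return "strong fit"    # a decision tree: technically machine learning
    else:
        return "not a culture fit"

print(ai_powered_candidate_screener("10 years of Python and synergy"))  # strong fit
print(ai_powered_candidate_screener("gardening enthusiast"))            # not a culture fit
```

The lawyers have reviewed this sketch as well. It remains defensible.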
AI-Generated Textbook
A textbook produced in whole or in part by artificial intelligence, raising philosophical questions about authorship, quality, and the recursive absurdity of using AI to write a textbook that satirizes AI. This textbook is an AI-generated textbook about AI-generated content about mythical beasts that are metaphors for AI. If this sentence made your head hurt, the pedagogy is working.
Example: You are reading one. The author used AI to write a book criticizing AI-generated content. This is either hypocrisy or performance art. The literature is divided.
Algorithm
A finite sequence of well-defined instructions for solving a problem, distinguishable from a recipe primarily by the absence of butter and the presence of Big O notation. In public discourse, "algorithm" has come to mean "the mysterious force that decides what I see online," which is technically inaccurate but emotionally precise.
Example: An algorithm for making toast: insert bread, press lever, wait. An algorithm for content recommendation: harvest behavioral data, model engagement patterns, optimize for metrics that correlate with compulsive usage, testify before Congress.
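The toast algorithm above, rendered as actual code to demonstrate the definition: a finite sequence of well-defined instructions. This is a sketch only; no toaster API is assumed beyond print statements, and butter remains out of scope.

```python
def make_toast() -> list[str]:
    """A finite sequence of well-defined instructions for solving a problem."""
    steps = ["insert bread", "press lever", "wait"]
    for step in steps:
        print(step)
    return steps

# Complexity analysis: O(1) in bread, O(n) in patience.
make_toast()
```

The content recommendation algorithm is left as an exercise for the reader and, eventually, for Congress.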
Allegorical Interpretation
The practice of reading a narrative on two levels simultaneously: the literal (a dragon attacks a village) and the symbolic (a large language model disrupts an industry). Readers who only perceive the literal level will find this textbook confusing. Readers who perceive both levels will find it uncomfortable. Both reactions are correct.
Example: When the ostrich in Chapter 6 refuses to acknowledge the approaching dragon, this is an allegory. It is also a staff meeting.
Allegory
A narrative in which characters, events, and settings represent abstract concepts, allowing the author to critique powerful institutions while maintaining plausible deniability. When a dragon in this textbook destroys a village, it is a dragon destroying a village. Any resemblance to a large language model disrupting an industry is coincidental and legally unprovable.
Example: George Orwell wrote about a farm. This textbook writes about unicorns. Both are about exactly what they are about, and also about something else entirely.
Ancient Unicorn Mythology
The body of pre-classical literature describing single-horned equines, dating to approximately 2700 BCE in the Indus Valley civilization. These early accounts describe the unicorn with the same level of empirical rigor later applied to cryptocurrency whitepapers, establishing a tradition of enthusiastic documentation of unverifiable phenomena that persists to the present day.
Example: Ctesias of Cnidus described unicorns in 400 BCE based on secondhand accounts from Persia, making him history's first tech journalist.
Appeal to Novelty
A logical fallacy in which a thing is assumed to be superior simply because it is new, without evidence that newness correlates with quality, effectiveness, or desirability. Appeal to novelty is the foundational fallacy of the technology industry and the reason every keynote contains the phrase "for the first time ever," a phrase that means "we did it recently" and implies "therefore it is good." These are different claims. They are treated as identical.
Example: "This is the first AI that can generate 3D models from text." This is interesting. "Therefore you should restructure your entire design pipeline around it." This does not follow. It will be funded anyway.
Artificial Intelligence
A field of computer science dedicated to creating machines that can perform tasks normally requiring human intelligence, or at minimum, tasks that can be described as requiring human intelligence in a press release. The definition of "intelligence" shifts conveniently to exclude whatever machines have most recently learned to do. AI has been "about to change everything" for approximately 70 years.
Example: In 1966, a professor assigned a student to "solve computer vision" as a summer project. The problem remains unsolved, though we have made significant progress in generating pictures of cats wearing hats.
Augmentation vs Replacement
The central debate in workforce AI adoption, in which "augmentation" means "AI will make you better at your job" and "replacement" means "AI will become your job," and the distinction between these outcomes is controlled by whoever writes the next quarterly earnings guidance. Augmentation is what companies promise. Replacement is what companies measure. The data is unambiguous about which one CFOs prefer.
Example: Phase 1: "AI will augment our team." Phase 2: "AI has made each team member 3x more productive." Phase 3: "We need one-third as many team members." The math is not subtle.
Automation
The use of technology to perform tasks without human intervention, a concept that has been eliminating jobs and creating new ones since the invention of the water wheel. Each wave of automation is greeted with the same two responses: "This time is different" (from the anxious) and "This time is not different" (from economists who will not be automated). Both are partially correct, which is the most unsatisfying possible outcome.
Example: ATMs were supposed to eliminate bank tellers. Instead, they made branches cheaper to operate, so banks opened more branches and hired more tellers. This is the kind of outcome that makes economists insufferable at parties.
Automation Anxiety
The fear that one's job will be automated, a condition that has existed since the Luddites smashed weaving looms in 1811 and that is currently experiencing its most widespread outbreak since the invention of the spreadsheet eliminated 94% of the world's human calculators. Automation anxiety is most acute among workers whose tasks are routine, and most vigorously denied among workers who believe their tasks are not routine. Those workers are wrong.
Example: "A machine could never do my job," said the worker whose job consisted of four steps that could be described in a flowchart.
Ban vs Embrace Debate
The binary framing adopted by educational institutions when confronted with AI, in which the only perceived options are total prohibition or uncritical adoption, as if the only responses to fire were "ban it" or "give it to toddlers." The Ban vs Embrace Debate has occupied more faculty meeting hours than any pedagogical question since "should students be allowed to use calculators," which was resolved by calculators becoming phones.
Example: 2023: "Should we ban ChatGPT?" 2024: "Should we integrate ChatGPT?" 2025: "Our students' ChatGPT usage exceeds our faculty's." 2026: "What is ChatGPT?" — from the next AI tool's perspective.
Beast Allegory Mapping
The systematic assignment of mythical creatures to the real-world phenomena they represent, conducted with the same rigor a cartographer applies to coastlines. Dragons map to disruptive technologies. Unicorns map to overhyped startups. Ostriches map to institutions in denial. The mapping is not subtle. Subtlety is for fields with smaller problems.
Example: If a creature breathes fire and destroys villages, it maps to AI. If a creature buries its head in sand, it maps to your school board.
Beast Classification System
A formal framework for categorizing mythical creatures by attributes such as habitat, disposition, flight capability, and Series B valuation. The beast classification system used in this textbook assigns each creature to an allegorical function (see: Beast Allegory Mapping), ensuring that no mythical animal is merely decorative. Decorative animals belong in children's literature. This is a serious academic text.
Example: The dragon is classified as Class IV: Disruptive Megafauna. The unicorn is Class II: Aspirational Equidae. The committee-formed-to-study-AI is Class VII: Institutional Invertebrate.
Bestiary Tradition
The medieval practice of cataloguing real and imagined animals alongside moral lessons, producing reference works of dubious zoological value but considerable entertainment value — a tradition continued faithfully by this textbook and, less intentionally, by Gartner's Magic Quadrant reports.
Example: The Aberdeen Bestiary (c. 1200) describes the bonnacon, a bull whose primary defense is explosive diarrhea projected over three acres. Modern bestiaries describe blockchain with similar enthusiasm and comparable utility.
Billion Dollar Valuation
A numerical threshold ($1,000,000,000) that transforms a private company from "a business" into "a unicorn," despite no change in its actual revenue, product quality, or ability to turn a profit. The valuation is determined by multiplying the number of users who have never paid for anything by a number a venture capitalist found spiritually compelling.
Example: The company had 50 million users, zero revenue, and a $3.2 billion valuation. The math is not the point. The math was never the point.
Bitcoin
A decentralized digital currency created in 2009 by the pseudonymous Satoshi Nakamoto, whose identity remains unknown, making Bitcoin the only major financial system whose founder cannot be subpoenaed. Bitcoin consumes more electricity annually than many countries, a fact that is either deeply troubling or "the cost of financial freedom," depending on whether you own any.
Example: Bitcoin was invented to decentralize finance. It is now primarily used to recentralize finance in the hands of people who bought Bitcoin early, which is a different power structure but not a different structure.
Biting Satire
Satire that is sharp enough to cause discomfort in its targets, as opposed to gentle satire, which makes everyone feel clever without challenging anyone's assumptions. Biting satire risks offense because it names specific failures, institutional cowardice, and collective delusions. This textbook practices biting satire. If you have read this far without feeling slightly attacked, you may not be the target audience. Or you may be the target, in which case: further research is needed.
Example: Gentle satire: "Technology companies sometimes exaggerate." Biting satire: "A company valued at $8 billion has 47 employees, no revenue, and a product that is a more expensive version of email."
Blockchain
A distributed, immutable ledger technology that records transactions across multiple computers, ensuring that no single entity can alter the record — a solution so elegant that the technology industry has spent fifteen years searching for a problem worthy of it. Blockchain is the answer. The question remains under development.
Example: "We should put it on the blockchain" is a sentence that has been spoken in every boardroom in America and has resulted in a useful product in approximately 0.4% of cases.
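The core idea is small enough to sketch: each block commits to the previous block's hash, so altering history invalidates everything after it. The toy below is a minimal illustration under stated assumptions; it omits consensus, networking, mining, and the fifteen-year search for a use case.

```python
import hashlib
import json

def make_block(data: str, prev_hash: str) -> dict:
    """Create a block whose hash commits to its data and its predecessor."""
    block = {"data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def chain_is_valid(chain: list[dict]) -> bool:
    """Verify every block's hash and every link to the previous block."""
    for i, block in enumerate(chain):
        payload = json.dumps(
            {"data": block["data"], "prev_hash": block["prev_hash"]},
            sort_keys=True,
        ).encode()
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block("genesis", "0" * 64)
ledger = [genesis, make_block("alice pays bob", genesis["hash"])]
print(chain_is_valid(ledger))         # True
ledger[0]["data"] = "bob pays alice"  # tamper with history
print(chain_is_valid(ledger))         # False
```

Note that a shared database with an audit log achieves the same guarantee for most of the use cases proposed in those boardrooms. This observation does not add $2 million to a valuation.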
Breakthrough Announcement
A press release or media event declaring that a significant technological barrier has been overcome, issued with sufficient frequency that the word "breakthrough" has been drained of all meaning through repetition. A genuine breakthrough occurs approximately once per decade. Breakthrough announcements occur approximately once per day. The ratio between these figures explains much about the technology industry.
Example: "Breakthrough" used correctly: the discovery of penicillin. "Breakthrough" used normally: a 1.7% improvement in benchmark performance on a dataset nobody uses in production.
Carbon-Neutral Magic
The accounting practice by which energy-intensive technologies claim environmental neutrality through carbon offsets, renewable energy credits, or creative redefinition of the word "neutral." Carbon-neutral magic transforms a data center that consumes the electrical output of a small city into a "sustainable" operation through a process that involves neither reducing consumption nor generating clean energy but instead purchasing certificates from someone who planted trees somewhere.
Example: The company achieved "carbon neutrality" by purchasing offsets from a forest that was not going to be cut down anyway. The net environmental impact was one press release.
Centaur
A being that is half human and half horse, representing the ideal of human-AI collaboration where the human provides wisdom and the AI provides the ability to process 400 million documents per second. In practice, most centaur arrangements involve the human providing the coffee and the AI providing the deliverables.
Example: "We're building a centaur workforce," the CEO announced, meaning half the employees would be replaced by the horse half.
Centaur Workforce
A labor model in which human workers are augmented by AI tools, combining human judgment with machine processing power to create a hybrid entity that is theoretically more capable than either component alone. The term references the mythological centaur (half human, half horse) and carries the same implicit question: where does the human end and the tool begin? In mythology, the answer was the waist. In the modern workforce, the answer is "wherever the org chart says."
Example: The centaur radiologist uses AI to flag potential anomalies and human expertise to interpret them. The question nobody asks the centaur radiologist is: "What happens when the AI can also interpret them?"
Change Management
The discipline of transitioning individuals, teams, and organizations from a current state to a desired future state, a process that organizational theorists describe in five phases and practitioners describe as "herding cats through a car wash." Change management in the context of AI adoption is complicated by the fact that the "desired future state" changes faster than the change management process can respond to it.
Example: The change management plan was approved in Q1 2024. By Q3 2024, the technology it was designed to manage had been superseded by three successor technologies, two of which had already entered the Trough of Disillusionment.
Character as Metaphor
A literary device in which a fictional character embodies an abstract concept, institution, or cultural phenomenon, allowing the reader to develop feelings about macroeconomic forces, which is not normally possible. Every mythical beast in this textbook is a character-as-metaphor. Some readers will form emotional attachments to the dragon. Those readers should examine what this reveals about their relationship with disruption.
Example: Sparkle the Unicorn is a metaphor for the entire AI hype industry. Sparkle is aware of this and finds it beneath comment.
Character Development
The process by which a fictional character becomes more complex, nuanced, and believable over the course of a narrative. In this textbook, character development applies to mythical beasts who begin as allegories and gradually acquire personality, motivation, and the kind of workplace frustration that can only come from attending one's 47th meeting about "aligning on alignment." Characters develop. Committees do not.
Example: In Chapter 1, the dragon is a symbol of disruption. By Chapter 12, the dragon is a fully realized character who is tired of being a symbol of disruption and would like to discuss its pension.
Chatbot
A software application designed to simulate conversation with human users, ranging in sophistication from "choose from these three options" to "I have composed a villanelle about your quarterly revenue shortfall." The chatbot is the most visible consumer interface for AI and therefore the most likely to be blamed when AI fails and credited when AI succeeds, regardless of what is happening behind the interface.
Example: Customer: "I need to cancel my subscription." Chatbot: "I understand you'd like to learn about our premium tier."
Claim Verification
The act of independently confirming the truth of a statement before repeating it, sharing it, or investing money based on it. Claim verification is the single most effective defense against misinformation, hype, and the forward-looking statements in an investor pitch deck. It is also the single most rarely practiced defense, because claim verification takes time and conviction spreads at the speed of a retweet.
Example: The press release said "Our AI is 99% accurate." Claim verification asks: Accurate at what? Measured how? Compared to what? Under what conditions? By whom? The press release does not say. This is the point of claim verification.
Coding Bootcamp Myth
The widely held belief that a 12-week intensive programming course can transform any motivated individual into a software engineer, a claim that is approximately as credible as the belief that a 12-week intensive unicorn-riding course can transform any motivated individual into a knight. Coding bootcamps produce graduates. The market produces employment for some of them. The gap between these facts is filled by marketing.
Example: The bootcamp promised a $95,000 starting salary. The fine print defined "starting salary" as "the salary at which graduates start looking for jobs that pay $95,000."
Committee Paralysis
The state in which a committee formed to address a problem becomes the primary obstacle to addressing it, through a process of procedural accretion that transforms urgency into agenda items and agenda items into tabled motions. Symptoms include excellent pastries, comprehensive meeting minutes, and zero actionable recommendations. The condition is terminal in 94.7% of cases.
Example: The AI Strategy Committee met 47 times over 18 months. Its final recommendation was to form a subcommittee to study the feasibility of making recommendations. The pastries were excellent throughout.
Confirmation Bias
The tendency to seek, interpret, and recall information in ways that confirm one's pre-existing beliefs, a cognitive bias so pervasive that you are almost certainly exhibiting it while reading this definition. Confirmation bias explains why AI optimists and AI pessimists can read the same research paper and reach opposite conclusions, both with complete sincerity. The bias is not the problem. The sincerity is.
Example: The VC read a study showing AI could automate 30% of tasks. The VC concluded: "AI will automate everything." The worker read the same study. The worker concluded: "My job is safe — I'm in the other 70%." Both are exhibiting confirmation bias. Both are comfortable.
Creature Characteristics
The observable traits used to identify and classify mythical beasts, including but not limited to: number of heads, fire-breathing capacity, wing-to-body ratio, and willingness to appear in a pitch deck. Creature characteristics must be empirically verifiable, a requirement that presents obvious challenges when the creature itself is not empirically verifiable.
Example: The phoenix's primary characteristic is cyclical self-immolation. The startup founder's primary characteristic is also cyclical self-immolation, though they call it "serial entrepreneurship."
Critical Thinking
The disciplined process of analyzing, evaluating, and synthesizing information to reach well-reasoned conclusions, a skill that every educational institution claims to teach and approximately 11.3% of them actually teach. Critical thinking is the ability to read a claim and ask "what is the evidence for this?" — a question so dangerous to established power structures that entire industries depend on people not asking it.
Example: Critical thinking applied to this textbook: "Is this a serious academic text about unicorns, or is it satire?" Critical thinking applied more deeply: "Does it matter, if the observations about AI and education are accurate either way?"
Cryptocurrency
A digital or virtual currency secured by cryptography and maintained on a blockchain, designed to operate without central bank control and, in practice, operating without central anything, including consumer protection, price stability, or the ability to explain to your mother what it is. There are currently over 22,000 cryptocurrencies, a number that would alarm any economist and delight any collector.
Example: "It's like money, but on the internet, but not like the money that's already on the internet, but different, but also the same, but decentralized, but also there are exchanges, but —" [conversation ends]
Cyclops
A one-eyed giant of considerable strength and limited depth perception, serving as the mythological archetype for any organization that possesses enormous resources but can only focus on one thing at a time. The Cyclops forged thunderbolts for Zeus; modern cyclopes forge quarterly earnings reports for shareholders. Both believe they are doing the most important work.
Example: The company had $4 billion in revenue and could not figure out how to update its website's contact form.
Dark Humor
Comedy that derives its effect from subjects typically considered serious, taboo, or distressing, such as job displacement, institutional failure, and the possibility that the reader's career will be automated within the decade. Dark humor is not cruel. It is the recognition that some truths are too uncomfortable to deliver straight and too important to leave undelivered. The joke is the spoonful of sugar. The medicine is the point.
Example: "The good news is that AI won't replace all jobs. The bad news is that it will replace yours specifically." This is dark humor. It may also be a performance review.
De-hornification
The reverse of hornification: the process by which a unicorn startup loses its billion-dollar valuation and returns to the status of a regular horse. De-hornification typically occurs during market downturns, interest rate increases, or the moment when someone examines the unit economics. De-hornification is painful, public, and rarely discussed by the investors who facilitated the original hornification.
Example: The company was valued at $3.2 billion in 2021 and $400 million in 2023. The horn fell off during the down round. The press called it a "valuation adjustment." The employees called it "Tuesday."
Deadpan Delivery
A comedic technique in which humorous material is presented without any indication that it is intended to be funny, forcing the audience to do the cognitive work of recognizing the joke. Deadpan delivery is the preferred register of this textbook and of Sparkle the Unicorn, who has never told a joke in her life and does not intend to start. The data is unambiguous.
Example: "The committee's report was thorough, well-researched, and completely ignored. This outcome was consistent with all available models." — A deadpan observation that is also a factual statement. The humor and the truth occupy the same sentence. This is the point.
Deer in Headlights Effect
The paralytic response exhibited by individuals or institutions confronted with technological change so rapid and fundamental that neither fight nor flight seems viable, resulting in a frozen state of inaction accompanied by wide eyes and the repeated phrase "we need more data." The deer in headlights effect is distinguished from the ostrich response by the presence of awareness: the deer sees the headlights. The deer simply cannot move.
Example: The middle manager attended three AI conferences, subscribed to four AI newsletters, bookmarked seventeen AI tools, and implemented zero changes to any workflow.
Digital Literacy
The ability to find, evaluate, create, and communicate information using digital technologies, a skill set that educational institutions have been "planning to integrate" into curricula since approximately 2003. Digital literacy is distinguished from computer literacy (knowing how to use a computer) and from AI literacy (knowing how to evaluate AI claims) by the fact that it has been the subject of more task forces and fewer actual curriculum changes than either.
Example: A digitally literate person can identify a phishing email. A digitally illiterate institution can fall for one that begins "Dear Esteemed University, you have won a grant."
Disruption Narrative
A story framework in which an innovative newcomer (typically small, agile, and disdainful of neckties) overthrows an established incumbent (typically large, slow, and decorated with mahogany) through superior technology and sheer audacity. The disruption narrative is the hero's journey of capitalism, and like the hero's journey, it is far more common in stories than in reality.
Example: Uber disrupted taxis. Airbnb disrupted hotels. Both disrupted labor laws. The narrative focuses on the first two.
Distributed Ledger
A database that is consensually shared and synchronized across multiple sites, institutions, or geographies, accessible by multiple people, and maintained without a central administrator, which sounds revolutionary until you realize it describes a Google Sheet with sharing permissions. The distinction between a distributed ledger and a shared database is real, technical, and almost never relevant to the use case being proposed.
Example: "It's not a database. It's a distributed ledger." — A sentence that adds $2 million to any startup's valuation.
Dragon
A large, fire-breathing reptilian entity whose primary economic function is the destruction of existing infrastructure, employment, and the comforting illusion that your job is safe. Dragons are classified as "disruptive technologies" in modern taxonomy, though the medieval classification of "existential threat to villages" remains equally accurate.
Example: Generative AI entered the writing industry the way a dragon enters a thatched-roof village: suddenly, thoroughly, and with minimal regard for the residents' five-year plans.
Education Technology Gap
The distance between the technology students use in their daily lives and the technology their schools use to teach them, measured in years, decades, or in extreme cases, centuries. The education technology gap is maintained through a combination of budget constraints, procurement processes designed to prevent rapid adoption, and a philosophical commitment to the belief that the overhead projector represents the apex of instructional technology.
Example: The student used an AI assistant to research, outline, and draft their paper at home, then submitted it on a Scantron form at school.
Energy Consumption Paradox
The contradiction between the technology industry's stated commitment to sustainability and its actual energy consumption, which increases with each new AI model, each new data center, and each new blockchain transaction. The energy consumption paradox is resolved in corporate communications through the word "efficiency," which refers to the ratio of useful output to energy input and which improves continuously without reducing total energy consumption, because the total amount of computing also increases continuously. This is known as the Jevons paradox. It is also known as "the thing nobody wants to talk about at the sustainability conference."
Example: The new AI model is 40% more energy-efficient per query. It is also used for 300% more queries. Total energy consumption increased 140%. The press release mentioned only the 40%.
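The arithmetic above is verifiable, which distinguishes it from most claims in this glossary. A minimal sketch, using the example's own illustrative numbers:

```python
# Jevons paradox arithmetic, with the (illustrative) numbers from the example.
energy_per_query_old = 1.0   # baseline energy per query, arbitrary units
energy_per_query_new = 0.6   # "40% more energy-efficient per query"
queries_old = 100
queries_new = 400            # "300% more queries" means 4x the volume

total_old = energy_per_query_old * queries_old   # 100.0
total_new = energy_per_query_new * queries_new   # 240.0

increase = (total_new - total_old) / total_old
print(f"Total energy consumption increased {increase:.0%}")  # 140%
```

The efficiency gain and the consumption increase are both real. The press release is under no obligation to print both.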
Ethical Bitcoin Paradox
The logical contradiction inherent in describing a system that consumes 150 terawatt-hours of electricity annually — more than the nation of Argentina — as compatible with environmental sustainability. The ethical Bitcoin paradox is resolved by proponents through the assertion that Bitcoin will transition to renewable energy, a claim that is technically possible, currently untrue, and structurally identical to the assertion that unicorns will transition from mythical to real.
Example: "Bitcoin will go green," said the miner, whose facility was powered by a coal plant. "Eventually." The timeline was not specified. See: Five Years Away Syndrome.
Fable
A short narrative featuring anthropomorphized animals that conveys a moral lesson, distinguishable from a venture capital pitch primarily by its brevity and the fact that the moral is stated explicitly rather than discovered during bankruptcy proceedings.
Example: The Tortoise and the Hare teaches patience. The Startup and the Market teaches burn rate.
Fact vs Fiction
The distinction between verifiable reality and invented narrative, a distinction that is becoming increasingly difficult to maintain in an era when AI can generate convincing text, images, and video, and when public figures can disseminate claims with equal reach and unequal accountability. The fact-fiction distinction is the most important epistemic tool humanity has, and it is under more pressure than at any point since the invention of the printing press.
Example: Fact: AI can generate text. Fiction: AI understands text. The distinction is load-bearing. The marketing department disagrees.
Five Years Away Syndrome
A chronic condition affecting emerging technologies in which the predicted timeline to practical, widespread application remains permanently fixed at "about five years," regardless of the passage of time. First documented in fusion energy research in the 1950s, Five Years Away Syndrome has since been diagnosed in quantum computing, self-driving cars, and artificial general intelligence. There is no known cure. Further research is needed, and will take approximately five years.
Example: Fusion power has been five years away since 1955. Self-driving cars have been five years away since 2015. AGI has been five years away since the speaker's most recent funding round.
Fusion Power
Energy generated by fusing atomic nuclei, the process that powers the sun, and which has been "five years away from commercial viability" since 1955, making it the original and most distinguished member of the Five Years Away Syndrome family. Fusion power promises unlimited clean energy, and it will deliver on this promise approximately five years from whenever you are reading this sentence.
Example: Every generation is told that fusion will solve energy problems within their lifetime. Every generation passes this promise to the next, like a family heirloom that does not work.
Future of Assessment
The unresolved question of how educational institutions will evaluate student learning when AI can produce any written assignment, pass any standardized test, and generate any creative artifact faster and more consistently than most students. The future of assessment is a topic of intense debate among educators, 97.3% of whom agree that "we need to rethink assessment" and 2.7% of whom have actually rethought it.
Example: "How do we know the student learned anything?" is a question that, post-AI, applies equally to a take-home essay and an in-class exam, because the student's AI is on their phone and the phone is in their pocket.
Generative AI
Artificial intelligence systems capable of producing text, images, code, and music, thereby automating the one category of work that humans were certain could never be automated. Generative AI creates novel content by recombining patterns from its training data, a process that is either "creative synthesis" or "very sophisticated plagiarism," depending on whether you are selling or competing with it.
Example: A generative AI can write a sonnet, paint a landscape, compose a symphony, and generate a press release claiming it has achieved consciousness. Only one of these outputs should concern you.
Graphic Novel
A book-length work of sequential art that combines visual storytelling with text, distinguished from a comic book by binding quality and the willingness of English professors to assign it. In this textbook, graphic novels feature mythical beasts in workplace scenarios that are immediately recognizable to anyone who has attended a status meeting, received a reorganization email, or been told that the layoffs are "an opportunity."
Example: A dragon wearing a lanyard delivers a presentation titled "Rightsizing for the Future." The graphic novel format allows you to see both the dragon's PowerPoint slides and the employees' faces. Both are devastating.
Griffin
A hybrid creature possessing the body of a lion and the head and wings of an eagle — a combination that, like most corporate mergers, looks impressive on paper but raises serious questions about internal organ compatibility. The griffin represents the mythological precedent for the modern "synergy" pitch.
Example: The AOL-Time Warner merger was, in retrospect, less griffin and more chimera: multiple incompatible parts sutured together by investment bankers.
Hornification
The process by which a standard equine acquires a conical keratin protrusion, typically through Series B funding, favorable press coverage, and the collective willingness of investors to value the protrusion at approximately $1 billion. Hornification is reversible: the horn may be lost through IPO, market correction, or the discovery that the horn was a party hat all along. De-hornification is not discussed in polite company.
Example: The company hornified in Q3 2021, achieving a $2.4 billion valuation on $6 million in annual revenue. The horn-to-revenue ratio of 400:1 raised no concerns among investors, who were at the time experiencing a condition known as "zero interest rate enthusiasm."
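The horn-to-revenue ratio is one of the few pitch-deck metrics a reader can recompute at home. A sketch using the example's illustrative figures (no real company is implied):

```python
# Horn-to-revenue ratio, using the (illustrative) figures from the example.
valuation = 2_400_000_000   # $2.4 billion, post-hornification
revenue = 6_000_000         # $6 million annual revenue

horn_to_revenue = valuation / revenue
print(f"Horn-to-revenue ratio: {horn_to_revenue:.0f}:1")  # 400:1
```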
Bonus Terms
Human-AI Collaboration
A working arrangement in which humans and AI systems contribute complementary capabilities to achieve outcomes neither could accomplish alone, also known as "the human checks the AI's work and adds the parts that require empathy, judgment, and the ability to read a room." Human-AI collaboration is the optimistic framing of a relationship that is currently being negotiated without the human's informed consent.
Example: The human provides context, nuance, and ethical judgment. The AI provides speed, scale, and the ability to process 400,000 documents overnight. The human also provides the coffee. The division of labor is not yet equitable.
Hype Cycle
A graphical model developed by Gartner depicting the maturation of emerging technologies through five phases, from wild enthusiasm to bitter disappointment to grudging utility. The hype cycle is itself subject to hype, having been cited so frequently that it has become the one Gartner framework that people use without paying for a Gartner subscription, which is ironic in ways Gartner does not officially acknowledge.
Example: Blockchain entered the Peak of Inflated Expectations in 2017 and has been sliding toward the Trough of Disillusionment ever since, pausing occasionally to announce another use case no one asked for.
Hype Cycle Visualization
A graphical representation of the Gartner Hype Cycle applied to a specific technology, allowing students to visually identify where a technology currently resides on the curve and to make predictions about its future that will be exactly as accurate as all previous predictions, which is to say, not very. The hype cycle visualization is most useful as a historical tool and most dangerous as a predictive one.
Example: Plotting AI on the hype cycle in 2024 is an exercise in which every student places the dot in a different location, each with compelling justification. This is the exercise's actual lesson.
Institutional Resistance
The tendency of established organizations to reject, delay, or passively obstruct changes that threaten existing power structures, workflows, or the ability to use the phrase "we've always done it this way" without being contradicted. Institutional resistance to AI follows the same five-stage pattern as grief: denial, anger, bargaining ("can we just use it for attendance?"), depression, and acceptance (typically achieved 3-5 years after the students have already adapted).
Example: The university formed a task force on AI in 2023. The task force recommended forming a committee. The committee recommended a pilot program. The pilot program was approved for 2026. The students have been using AI daily since 2022.
Intelligent Textbook
A textbook that incorporates interactive elements, adaptive content, and AI-driven personalization to create a learning experience that adjusts to the student's needs, as opposed to a traditional textbook, which adjusts to nothing and weighs eleven pounds. The "intelligent" in intelligent textbook refers to the technology embedded within it, not to any judgment about the intelligence of traditional textbooks, though the implication is noted.
Example: An intelligent textbook notices you have been staring at the same page for twelve minutes and offers a simpler explanation. A traditional textbook does not notice you. A traditional textbook will never notice you.
Interactive Simulation
A digital model that allows users to manipulate variables and observe outcomes in real time, used in educational contexts to demonstrate complex systems and in venture capital contexts to demonstrate whatever the investor wants to see. Interactive simulations are powerful learning tools because they make abstract concepts tangible. They are also powerful deception tools, for exactly the same reason.
Example: An interactive simulation of unicorn population dynamics demonstrates exponential growth, carrying capacity, and the effects of Series B funding on horn length. The simulation is pedagogically sound. The subject matter is imaginary. These facts coexist peacefully.
Investor Pitch Deck
A presentation of 10-15 slides designed to convince venture capitalists to invest millions of dollars in a company, a format that compresses the complexity of an entire business into fewer pages than a restaurant menu. The pitch deck is the haiku of capitalism: constrained in form, boundless in ambition, and frequently composed under the influence of caffeine and delusion in equal measure.
Example: Slide 1: The Problem (massive). Slide 2: The Solution (our product). Slide 3: The Market ($4.7 trillion). Slide 4: The Team (we went to Stanford). Slide 5: The Ask ($10 million for 8%). Slides 6-15: Graphs going up and to the right.
Irony
A rhetorical device in which the intended meaning is opposite to the literal meaning, or in which the outcome of events contradicts expectations. Irony is the engine of this textbook: a book about mythical beasts that is really about real technologies, written by AI to satirize AI, and published in a format that it satirizes. If you can identify all layers of irony in the previous sentence, you may qualify for a PhD in Unicorn Studies.
Example: An AI-generated textbook criticizing AI-generated content is ironic. Readers who miss the irony and cite this textbook as a genuine academic source would constitute a deeper layer of irony. Both outcomes are statistically likely.
Job Displacement
The elimination of human employment positions through automation, technological change, or the CEO's attendance at a conference where a vendor demonstrated an AI that can "do what your team does, but faster." Job displacement is a real and serious consequence of technological progress that is discussed with exactly two tones: existential terror (by those whose jobs are threatened) and detached optimism (by those whose jobs are doing the threatening).
Example: The company replaced 40 customer service representatives with an AI chatbot. The chatbot cannot de-escalate a crying customer. The company considers this a feature.
Kraken
A colossal cephalopod of the deep ocean, capable of dragging entire ships beneath the waves. In modern usage, a metaphor for technical debt that lurks beneath the surface of a codebase, unseen by management until it pulls the entire release schedule into the abyss during the week before launch.
Example: "We can ship by Friday," said the engineer who had not yet looked at the database migration scripts.
Large Language Model
A neural network trained on vast quantities of text that predicts the next token in a sequence with sufficient accuracy to produce coherent paragraphs, pass bar exams, and terrify everyone who writes for a living. The "large" in the name refers to parameter count, training data, compute costs, and the existential questions it raises. It does not refer to the model's understanding of what it is saying, which remains a matter of vigorous debate and considerable denial.
Example: This glossary entry was written by a human. Or was it? The fact that you cannot tell is the entire point of the preceding 60 words.
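"Predicts the next token" can be demonstrated at toy scale. A bigram counter is to a large language model roughly what a paper airplane is to a 747, but the principle is the same; the corpus below is invented for illustration:

```python
from collections import Counter, defaultdict

corpus = ("the unicorn raised a round the unicorn raised expectations "
          "the dragon raised concerns").split()

# Count which word follows which: the entire "language model", at toy scale.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict(word: str) -> str:
    """Return the most frequent next token. No understanding included."""
    return following[word].most_common(1)[0][0]

print(predict("unicorn"))  # -> "raised"
print(predict("the"))      # -> "unicorn" (seen twice, vs "dragon" once)
```

Scale the corpus up by twelve orders of magnitude and replace the counter with a transformer, and the debate about whether this constitutes understanding becomes a career.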
LinkedIn Skill Inflation
The phenomenon in which professionals list increasingly numerous and increasingly implausible skills on their LinkedIn profiles, driven by algorithmic incentives that reward keyword density over accuracy. LinkedIn skill inflation has produced profiles in which a single individual claims expertise in 47 distinct technologies, 12 leadership frameworks, and "strategic visioning," a term that means nothing and is endorsed by 99 people.
Example: Skills listed: Machine Learning, Deep Learning, AI Strategy, Digital Transformation, Change Management, Thought Leadership, Unicorn Husbandry. Skills used daily: Email, Excel, avoiding eye contact in meetings.
Logical Fallacy
An error in reasoning that renders an argument invalid, regardless of whether its conclusion happens to be true. There are over 100 named logical fallacies, and the technology industry employs most of them on a quarterly basis. The most common in AI discourse are the appeal to novelty ("it's new, therefore it's good"), the appeal to authority ("Elon said so"), and the false dilemma ("either you embrace AI completely or you're a Luddite").
Example: "Every major technology was doubted at first, and AI is being doubted, therefore AI will succeed." This is an affirming-the-consequent fallacy. Many doubted technologies failed. The ones that failed don't get quoted in pitch decks.
Machine Learning
A subset of artificial intelligence in which algorithms improve through exposure to data rather than explicit programming, mirroring the human learning process except faster, cheaper, and without the student loans. Machine learning systems learn patterns from historical data, which means they are exceptionally good at perpetuating whatever biases were present in the past.
Example: A machine learning model trained on historical hiring data learned to replicate every bias in historical hiring. It was extremely accurate. This was the problem.
Magical Thinking
The cognitive process of believing that desire, intention, or a sufficiently compelling slide deck can alter material reality. In anthropological contexts, magical thinking involves rituals and incantations. In Silicon Valley, it involves pitch decks and press releases. The structural similarity has been noted by scholars but not by participants.
Example: "If we build it, they will come" is magical thinking. "If we build it, demo it, get press coverage, and raise $50 million before anyone asks if it works, they will come" is a business plan.
Media Literacy
The ability to access, analyze, evaluate, and create media in a variety of forms, a skill set that is essential for navigating a media environment in which the same press release is republished as "news" by seventeen outlets simultaneously and a TikTok video has more epistemic authority than a peer-reviewed study. Media literacy is the difference between "I read it online" and "I verified it through multiple independent sources." Currently, the first approach is winning.
Example: A media-literate person reads "AI breakthrough" and checks the paper. A media-illiterate person reads "AI breakthrough" and updates their LinkedIn headline. Both are acting rationally given their respective information environments, which is the actual problem.
Medieval Unicorn Lore
The extensive body of European literature and art (c. 500-1500 CE) asserting that unicorns could be tamed only by virgins, a filtering mechanism roughly as effective as requiring three years of experience with a programming language released eighteen months ago.
Example: The Hunt of the Unicorn tapestries at the Met Cloisters depict seven stages of unicorn capture, which map precisely to the seven stages of a typical enterprise sales cycle.
Mermaid
An aquatic humanoid whose enchanting song lures sailors to their doom, functionally identical to a SaaS product demo. The mermaid's upper half promises beauty and companionship; the lower half is a fish. The data is unambiguous: this maps precisely to the relationship between a product's landing page and its actual feature set.
Example: "The demo was gorgeous. Then we tried to make it walk on land." — Anonymous enterprise customer, Q3 review.
Meta-Fiction
Fiction that is self-consciously aware of itself as fiction, drawing attention to its own construction, conventions, and artificiality. This textbook is meta-fiction: an AI-generated textbook about AI-generated content, written in the style of a textbook to satirize textbooks, containing this glossary entry about meta-fiction. If this feels recursive, that is because it is. The recursion is the point. The recursion is always the point.
Example: This sentence is an example of meta-fiction because it is aware of itself as an example. The glossary knows it is a glossary. You know the glossary knows. The glossary knows you know. We can stop here.
Metaverse
A persistent, shared, three-dimensional virtual world in which users interact through digital avatars, conduct business, and experience reality in a form that is distinguished from actual reality primarily by lower resolution and higher expectations. The metaverse was announced as the future of human interaction in 2021 and reclassified as a $36 billion write-off in 2023, achieving a hype-to-value ratio that may never be surpassed.
Example: The CEO renamed the entire company after the metaverse. The metaverse's most popular activity turned out to be attending virtual meetings that could have been emails, mirroring the physical world with uncomfortable precision.
Middle Manager Dilemma
The existential crisis facing mid-level organizational leaders whose primary functions — information routing, status reporting, meeting scheduling, and the gentle enforcement of vague directives — are precisely the functions most efficiently performed by AI. The middle manager dilemma is not whether AI can do the job, but whether anyone will notice the difference, a question that middle managers have been anxiously avoiding since 2023.
Example: The middle manager's value proposition is "I synthesize information from multiple sources and communicate it upward." This is also the value proposition of a dashboard.
Minotaur
A bull-headed humanoid confined to an elaborate labyrinth, serving as the mythological prototype for the modern enterprise software user trapped in a legacy system's navigation menu. The Minotaur did not choose the labyrinth. Neither did you choose SAP.
Example: Theseus needed a ball of thread to escape the labyrinth. Modern employees need a 47-page PDF titled "Quick Start Guide."
Modern Unicorn Culture
The contemporary cultural phenomenon in which the unicorn has been stripped of all mythological gravitas and reassigned to represent glitter, positivity, and consumer products marketed to people who describe themselves as "quirky." Simultaneously, the term designates billion-dollar private companies, creating the only context in which Lisa Frank and Goldman Sachs occupy the same semantic field.
Example: You can purchase a unicorn-themed pool float and a unicorn-valued stock portfolio on the same phone, during the same lunch break.
Moving the Goalposts
The rhetorical practice of redefining what constitutes "real" AI whenever current AI achieves what was previously considered the threshold. In 1997, beating the world chess champion would prove AI. It did not. In 2016, beating the world Go champion would prove AI. It did not. In 2023, passing the bar exam would prove AI. The goalposts are now visible only through the James Webb Space Telescope.
Example: "When AI can write a novel, then I'll be impressed." AI writes a novel. "I meant a GOOD novel." AI wins a literary prize. "I meant a prize I've heard of."
Mythical Beast
A creature whose existence is supported by approximately the same weight of evidence as most Series A pitch decks. Mythical beasts persist across cultures because every civilization needs a metaphor for things that sound magnificent but cannot be reliably observed in production environments.
Example: The unicorn, the dragon, and the startup that achieved profitability in its first year.
Mythical Product-Market Fit
The legendary condition in which a product perfectly satisfies a genuine market need, reportedly observed by founders in the wild approximately as often as unicorns are observed by medieval peasants. Product-market fit is most commonly "achieved" retroactively, in the founder's memoir, after the company has been acquired for parts.
Example: "We always had product-market fit. The market just didn't know it yet." — Founder whose company had 12 users, 11 of whom were employees.
Mythical ROI
The return on investment projected in a pitch deck for a product that does not yet exist, calculated using assumptions that would be generous if applied to a product that does exist. Mythical ROI is expressed in percentages that increase as the probability of achieving them decreases, forming a curve that mathematics has not yet named but that this textbook proposes calling the "inverse plausibility function."
Example: Projected ROI: 4,700% over 36 months. Actual ROI: -100% over 18 months, at which point the company pivoted to a different mythical ROI.
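The "inverse plausibility function" has no accepted mathematical definition. The function below is a purely hypothetical sketch in which every additional promised percentage point costs credibility; both the name and the decay curve are inventions of this example:

```python
def inverse_plausibility(projected_roi_pct: float) -> float:
    """Hypothetical plausibility of a pitch-deck ROI claim.

    Not a real financial model: a toy decay curve, proposed here
    purely for illustration.
    """
    return 1.0 / (1.0 + projected_roi_pct / 100.0)

# The 4,700% projection from the example above:
print(f"{inverse_plausibility(4700):.1%} plausible")  # ~2.1%
```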
Mythical vs Real Creatures
A distinction that appears straightforward until one attempts to apply it consistently. A horse is real. A unicorn is mythical. A horse with a party hat is real but aspirational. A startup valued at $4 billion with no revenue is classified differently depending on who is asking and whether they have invested.
Example: Real: horses, revenue, gravity. Mythical: unicorns, product-market fit, "we'll be profitable next quarter."
Narrative Arc
The structural progression of a story from beginning through conflict to resolution, typically described as rising action, climax, and falling action. The narrative arc of a technology hype cycle follows the same structure: rising expectations, peak hype, and falling stock price. Both are satisfying to study in retrospect and painful to experience in real time.
Example: Act 1: "This technology will change the world." Act 2: "This technology has some challenges." Act 3: "This technology has been acquired by Oracle." The narrative arc is complete.
Narrative Archetype
A recurring story pattern that appears across cultures and centuries, suggesting either universal human truths or a persistent lack of originality. The hero's journey, the trickster, the quest — all are narrative archetypes. "Founder drops out of college to build something in a garage" is the Silicon Valley variant, and it is told with precisely the same reverence as any myth.
Example: The monomyth: departure, initiation, return. The startup myth: founding, funding, acquisition (or departure, initiation, bankruptcy).
Natural Language Processing
The subfield of AI concerned with enabling computers to understand, interpret, and generate human language, a goal that has been "nearly achieved" every decade since the 1950s. NLP has progressed from keyword matching to transformer architectures, which is genuinely impressive and also exactly the kind of sentence that makes normal people stop reading.
Example: NLP allows your phone to autocorrect "meeting" to "meatloaf" with near-perfect consistency.
Neural Network
A computing architecture inspired by the human brain, in the same way that an airplane is inspired by a bird: the metaphor is structurally useful and biologically misleading. Neural networks consist of layers of interconnected nodes that process information through weighted connections. They do not think, feel, or deserve a salary, despite what the marketing department implies.
Example: A neural network has neurons the way a hot dog has dogs.
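What "weighted connections" means can be shown in a few lines, with no framework and no thinking involved. The weights below are arbitrary placeholders; a real network would learn them from data, along with the data's biases:

```python
import math

def forward(x, weights, biases):
    """One dense layer: weighted sum of inputs, then a sigmoid squash."""
    return [
        1.0 / (1.0 + math.exp(-(sum(w * xi for w, xi in zip(row, x)) + b)))
        for row, b in zip(weights, biases)
    ]

# Two inputs -> two hidden nodes -> one output. Weights are made up;
# training would adjust them until the output was useful.
x = [0.5, -1.0]
hidden = forward(x, weights=[[0.8, -0.2], [0.4, 0.9]], biases=[0.1, -0.3])
output = forward(hidden, weights=[[1.5, -1.1]], biases=[0.05])
print(output)  # one number between 0 and 1; no thoughts were involved
```

Everything else in a production network is this, repeated billions of times, which is either a profound insight about intelligence or a profound insight about marketing.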
Ostrich Response
An institutional coping mechanism characterized by the willful refusal to acknowledge the existence, relevance, or implications of a transformative change, typically accompanied by the assertion that "this doesn't affect us" or "it's just a fad," spoken with the confidence of a person standing in the path of a glacier. Named for the ostrich's apocryphal habit of burying its head in sand, which the ostrich does not actually do, making the metaphor doubly appropriate for institutions that are wrong about how they handle being wrong.
Example: "AI can't do what our teachers do," said the administrator, having never used AI and having never asked what AI does.
Parody
A creative work that imitates the style or conventions of another work, genre, or institution for comic effect. This textbook is a parody of intelligent textbooks, interactive learning platforms, and the entire educational publishing industry. It is indistinguishable from the thing it parodies, which is either the highest form of parody or evidence that the thing being parodied was already absurd.
Example: This glossary is a parody of an academic glossary. It follows every convention of an academic glossary. The terms are real. The definitions are accurate. The tone is what makes it a parody, and also what makes it more honest than most academic glossaries.
Peak of Inflated Expectations
The second phase of the Gartner Hype Cycle, in which a technology receives maximum media attention, maximum venture capital, and minimum scrutiny. During this phase, the technology is described as "transformative," "revolutionary," and "the future of everything," claims that are supported by demo videos and contradicted by the technology's inability to function in rain.
Example: In 2021, the metaverse was at the Peak of Inflated Expectations. By 2023, it was a $36 billion write-off with cartoon legs.
Pegasus
A winged horse capable of flight, representing any technology that promises to transcend the fundamental limitations of its category. Pegasus is a horse that flies. Blockchain is a database that is somehow also a revolution. The structural parallel is exact.
Example: "It's like a regular spreadsheet, but it flies." — Approximately 94.7% of enterprise software pitches, paraphrased.
Performance Review Paradox
The contradiction inherent in evaluating employee performance using AI-generated metrics in an era when employee output is increasingly AI-generated, creating a feedback loop in which AI evaluates AI's work and attributes it to a human who receives a raise or a warning based on the assessment. The performance review paradox will not be resolved. It will be automated.
Example: The employee used AI to write the report. The manager used AI to evaluate the report. HR used AI to calibrate the evaluation. The employee received a rating of "meets expectations." Nobody's expectations were consulted.
Perpetual Beta
A software release strategy in which a product remains in "beta" (testing) status indefinitely, allowing the company to ship incomplete features while maintaining the defense that the product is not yet finished. Perpetual beta transforms "our product has bugs" into "our product is in beta," a linguistic maneuver that costs nothing and excuses everything. Gmail was in beta for five years. Some products have been in beta for longer than some employees have been alive.
Example: "It's still in beta" is the software industry's equivalent of "the check is in the mail."
Phoenix
A mythical bird that immolates itself and rises from its own ashes, much like a tech company that pivots after burning through $200 million. The phoenix's resurrection cycle averages 500 years; the average failed startup's "pivot to AI" takes approximately 3.7 months. Both claim the transformation was always the plan.
Example: "We didn't fail. We phoenixed." — Every founder's LinkedIn post after shutting down their original product.
Plateau of Productivity
The fifth and final phase of the Gartner Hype Cycle, in which a technology achieves mainstream adoption and becomes boring. The Plateau of Productivity is where technologies go to become useful and uninteresting, which is the highest compliment the market can pay and the reason no venture capitalist will return your calls about it.
Example: Email is on the Plateau of Productivity. Nobody gives a TED talk about email. Email works. This is email's crime.
Population Dynamics Model
A mathematical representation of how populations change over time due to birth, death, immigration, and predation, applied in this textbook to mythical beasts whose population dynamics are governed by venture capital funding cycles rather than ecological factors. The Lotka-Volterra equations describe predator-prey relationships. In the unicorn ecosystem, the predator is market correction and the prey is investor confidence.
Example: The unicorn population peaked in 2021, declined sharply in 2022 due to interest rate predation, and is currently experiencing a cautious recovery in the AI-adjacent habitat. The carrying capacity of the ecosystem is determined by the Federal Reserve.
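The Lotka-Volterra equations mentioned above are simple enough to sketch directly. Below is a minimal, tongue-in-cheek illustration with the glossary's casting (prey = investor confidence, predator = market correction); the parameter values are invented for demonstration, not calibrated against any actual funding cycle.

```python
# Classic Lotka-Volterra predator-prey model, integrated with a simple
# forward-Euler step. Labels follow the glossary entry: prey = investor
# confidence, predator = market correction. All parameters are invented.

def lotka_volterra(prey, predator, alpha=1.1, beta=0.4, delta=0.1, gamma=0.4,
                   dt=0.01, steps=2000):
    """Integrate dP/dt = alpha*P - beta*P*Q and dQ/dt = delta*P*Q - gamma*Q."""
    history = []
    for _ in range(steps):
        d_prey = (alpha * prey - beta * prey * predator) * dt
        d_predator = (delta * prey * predator - gamma * predator) * dt
        prey += d_prey
        predator += d_predator
        history.append((prey, predator))
    return history

# Investor confidence and market correction oscillate out of phase,
# as the unicorn ecosystem described above suggests.
trajectory = lotka_volterra(prey=6.0, predator=3.0)
```

With these parameters the two populations cycle rather than settle, which is a reasonable cartoon of boom-and-bust funding dynamics; the carrying capacity set by the Federal Reserve is left as an exercise.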
Quantum Computing
A computing paradigm that harnesses quantum mechanical phenomena to process information, enabling calculations that would take classical computers billions of years to complete, assuming the quantum computer can maintain coherence for longer than the average attention span of a fruit fly. Quantum computing has been "five years away from practical application" since approximately 2001, establishing it as the most patient technology in history.
Example: A quantum computer can theoretically break all existing encryption. A quantum computer can practically operate for about 0.0001 seconds before decoherence sets in. Both facts are true. Only one appears in the press release.
Quantum Supremacy
The point at which a quantum computer performs a calculation that no classical computer can complete in a reasonable time, a milestone first claimed by Google in 2019 using a problem specifically designed to be easy for quantum computers and useless for everything else. Quantum supremacy is the technological equivalent of winning a race you organized, on a track you built, using rules you wrote, against a competitor who was not informed the race was happening.
Example: Google's quantum computer solved a problem in 200 seconds that would take a classical supercomputer 10,000 years. IBM disputed the 10,000-year estimate. Nobody disputed that the problem had no practical application.
Real or Fake Exercise
A pedagogical activity in which students are presented with claims, products, or press releases and asked to determine whether they are genuine or fabricated, an exercise that has become exponentially more difficult since the advent of generative AI and that was already quite difficult given the technology industry's longstanding commitment to claims that sound fabricated but are real. The real-or-fake exercise is the practical application of critical thinking, media literacy, and a healthy sense of existential dread.
Example: "An AI-generated image won an art competition." Real. "A blockchain-based system was used to track organic lettuce from farm to table." Real. "A company raised $100 million with no product." Real. The exercise is harder than it appears.
Reskilling
The process of learning entirely new skills to transition into a different occupation, typically recommended by people who have never had to do it. Reskilling is presented in policy documents as a smooth, empowering journey of personal growth; in reality it is a terrifying, expensive, and uncertain process undertaken by adults who have mortgages, children, and a 37-week window before their severance runs out.
Example: "Just reskill," said the CEO, who has had the same job title for twelve years and whose primary skill is saying things like "just reskill."
Satire
A literary form that uses humor, irony, and exaggeration to expose human folly, particularly the folly of people who are absolutely certain they are not the target. The effectiveness of satire is inversely proportional to the self-awareness of its subject. This textbook is, of course, not satire. It is a serious academic work.
Example: You are reading one.
Satirical Writing
A form of writing that uses humor, irony, and exaggeration to critique individuals, institutions, or society, typically adopted by authors who have strong opinions and weak punching arms. Satirical writing is distinguished from other forms of criticism by its entertainment value and from other forms of entertainment by its tendency to make the reader mildly uncomfortable. This textbook contains no satirical writing. This textbook is a serious academic work.
Example: Jonathan Swift proposed eating Irish children to solve poverty. This textbook proposes studying unicorns to understand AI. Both proposals are equally serious.
Self-Driving Car
An autonomous vehicle capable of navigating roads without human input, a technology that has been "almost ready" since approximately 2015 and that currently operates reliably in carefully mapped urban zones during good weather with a human safety driver who is technically not driving but is also not allowed to read a book. The self-driving car is the most expensive and most publicized demonstration of the gap between "works in a demo" and "works."
Example: The self-driving car navigated 1,000 miles of California highway flawlessly. It then encountered a construction cone in a configuration not present in its training data and stopped forever.
Series B Funding
The second major round of venture capital financing, typically occurring after a startup has demonstrated "traction" (a word that means whatever the pitch deck needs it to mean) and before it has demonstrated "profitability" (a word that means what it has always meant, which is why it is avoided). Series B is the round at which hornification most commonly occurs.
Example: Pre-Series B: "We're building something special." Post-Series B: "We're building something special, but now with a $200 million valuation and a ping-pong table."
Siren
A creature whose irresistible song compels listeners to abandon rational judgment and steer directly toward catastrophe. In contemporary usage, indistinguishable from a keynote presentation at any major technology conference. Odysseus survived by plugging his crew's ears with wax and having himself lashed to the mast; modern attendees have neither wax nor mast.
Example: "This changes everything," the siren sang, and seventeen VCs opened their checkbooks simultaneously.
Skill Obsolescence
The process by which a previously valuable professional skill loses market relevance, typically over a period of years but occasionally over a period of a single OpenAI product launch. Skill obsolescence is a natural feature of economic progress and a deeply unnatural experience for the person whose skill has just been obsoleted. The phrase "lifelong learning" was coined to address this and sounds more cheerful than the reality warrants.
Example: In 2019, "prompt engineering" was not a skill. In 2023, it was a $300,000 job. In 2025, it was being automated. The entire lifecycle of this skill fits within one presidential term.
Slope of Enlightenment
The fourth phase of the Gartner Hype Cycle, during which practical applications of a technology emerge and realistic expectations replace hallucinated ones. The Slope of Enlightenment is the least discussed phase because it involves actual work, modest claims, and the kind of incremental progress that does not generate keynote invitations.
Example: During the Slope of Enlightenment, companies stop saying "AI will replace all workers" and start saying "AI reduced our invoice processing time by 23%." The second sentence is more valuable and infinitely less shareable.
Source Evaluation
The process of assessing the credibility, reliability, bias, and authority of an information source before accepting its claims, a practice that requires approximately 45 seconds and is performed by approximately 4.2% of internet users. Source evaluation is the intellectual equivalent of checking whether the milk has expired before drinking it: obvious, essential, and widely skipped.
Example: Source: peer-reviewed journal (credibility: high). Source: company press release (credibility: aspirational). Source: anonymous Reddit comment (credibility: depends on the subreddit, which is a sentence that should not be true but is).
Sparkle's Razor
The analytical principle that the simplest explanation for a technology claim is usually "the demo was curated," a heuristic developed by Sparkle the Unicorn and applicable to approximately 94.7% of keynote presentations. Sparkle's Razor does not assert that all demos are misleading. It asserts that the burden of proof rests with the demonstrator, not the audience. This is a minority position in the technology industry.
Example: "The AI wrote a complete application from a single sentence." Sparkle's Razor: "Show me the seventeen sentences that were tried first."
Startup Mythology
The collection of origin stories, founding legends, and heroic narratives that comprise the oral tradition of Silicon Valley, featuring recurring motifs such as the garage, the dropout, the pivot, and the inexplicable decision to name a company after a fruit. Startup mythology serves the same cultural function as Greek mythology: it provides aspirational models for behavior and convenient explanations for why some people have all the money.
Example: Every startup origin story involves a moment of revelation in which the founder saw a problem nobody else could see. In retrospect, the problem was usually visible to everyone. What nobody else had was a friend who was a venture capitalist.
Storyboarding
The process of planning a visual narrative by sketching key frames in sequence, used in film, animation, and graphic novel production. Storyboarding allows creators to identify pacing problems, narrative gaps, and scenes where a dragon in business casual is funnier as a wide shot than a close-up. Storyboarding is planning. The data is unambiguous: planning improves outcomes. The data is also unambiguous: most people skip it.
Example: Panel 1: Dragon enters office. Panel 2: Dragon presents quarterly earnings. Panel 3: Office is on fire. Panel 4: Dragon presents insurance claim. The storyboard ensures the comedic timing is intentional.
Symbolic Representation
The use of concrete images or characters to stand in for abstract concepts, a technique as old as literature itself and as current as the unicorn emoji in your company's Slack channel. Symbolic representation allows authors to discuss sensitive topics (such as "your entire industry is built on shared hallucination") without being directly confrontational.
Example: This textbook uses a unicorn to represent overhyped technology. The unicorn was not consulted and has retained counsel.
Taxonomy
The science of classification and naming, originally applied to biological organisms by Linnaeus and now applied to mythical beasts, vaporware, and meeting types with equal rigor. A good taxonomy imposes order on chaos. A great taxonomy reveals that the chaos was organized all along, just not in a way anyone wanted to acknowledge.
Example: Kingdom: Animalia. Phylum: Corporata. Class: Startupidae. Order: Hyperbolia. Family: Unicornidae. Genus: Vaporius. Species: V. slidedeckus.
Tech Press Release
A document issued by a technology company announcing an achievement, product, or partnership in language carefully calibrated to be technically defensible and emotionally misleading. The tech press release is the primary literary genre of Silicon Valley and observes conventions as rigid as the sonnet: 14 lines of enthusiasm, a quote from the CEO containing the word "excited," and a closing paragraph that nobody reads.
Example: "We are excited to announce our strategic partnership with [company] to leverage AI to transform [industry]." This sentence contains no information. It was not intended to.
Techno-Optimism Disorder
A cognitive condition characterized by the persistent belief that technology will solve all problems, including the problems created by technology, through a recursive process that requires no human behavior change, no regulatory intervention, and no acknowledgment that the previous technology was supposed to have already solved these problems. Techno-optimism disorder is not listed in the DSM-5 but is recognizable in approximately 100% of keynote speeches at CES.
Example: "AI will solve climate change," said the executive whose AI model consumed enough electricity to power a small city. The irony is not therapeutic but is well-documented.
Technology Fantasy
A widely shared belief about a technology's future capabilities that is unsupported by current evidence but sustained by collective enthusiasm, keynote presentations, and the human need to believe that tomorrow will be dramatically different from today. Technology fantasies are distinguishable from technology forecasts by their resistance to contradictory evidence. A forecast is updated when new data arrives. A fantasy is defended.
Example: The metaverse, flying cars, and paperless offices are technology fantasies. Email, smartphones, and online shopping were once technology fantasies that became real, which is exactly enough evidence to keep all the other fantasies alive.
Timeline Visualization
A graphical representation of events in chronological order, used to illustrate historical progressions, project plans, and the steadily receding horizon of technology promises. Timeline visualizations are most instructive when they include both the predicted date and the actual date of achievement, producing a visual pattern that resembles a set of goalposts being moved to the right. Indefinitely.
Example: Predicted: self-driving cars available to consumers by 2020 (predicted in 2015). Actual: self-driving cars available in select geofenced areas of Phoenix, Arizona, during daylight hours, with a safety driver, in 2024. The timeline visualization speaks for itself.
Trough of Disillusionment
The third phase of the Gartner Hype Cycle, characterized by failed implementations, broken promises, and press coverage that uses the word "actually" in the headline ("What Blockchain Actually Does"). During this phase, 94.7% of startups in the space quietly pivot to something else, and the remaining 5.3% begin the slow, unglamorous work of building something useful. Nobody writes about the Trough. The Trough does not trend.
Example: "Whatever happened to [technology]?" is the headline that marks a technology's arrival in the Trough. The answer is always "It's being used by three companies in Ohio, and it works fine."
Unicorn
A horse-adjacent creature distinguished by a single spiral horn and a valuation exceeding one billion dollars. In zoological contexts, a mammal of unverified provenance. In financial contexts, a company of unverified revenue. The literature suggests no meaningful distinction between these two definitions.
Example: "We're building the Uber of unicorn grooming" — an actual sentence that could plausibly appear in a pitch deck, a fantasy novel, or both.
Unicorn in Art History
The scholarly study of unicorn depictions across visual media, from Mesopotamian cylinder seals to Renaissance tapestries to the startup logo your nephew made in Canva. The artistic quality has declined inversely with the number of available tools for creating it. The literature suggests this pattern extends beyond unicorns.
Example: The Lady and the Unicorn tapestries took decades to weave. Your company's unicorn logo took eleven minutes in an AI image generator. Both are considered art by someone.
Unicorn Literacy
The ability to distinguish between genuine innovation and elaborately presented mythology, applied equally to medieval bestiaries and modern pitch decks. A unicorn-literate individual can read a press release about a "breakthrough" and correctly identify which claims are supported by evidence and which are supported by enthusiasm. Current global unicorn literacy rates are estimated at 6.2%.
Example: A unicorn-literate reader can distinguish between "Our AI understands language" and "Our AI predicts the next token in a sequence" in under four seconds.
Unicorn Startup Metaphor
The conceptual framework in which a private company valued at over one billion dollars is compared to a mythical creature, on the grounds that both are magnificent, rarely observed in the wild, and may not technically exist. The term was coined by Aileen Lee in 2013, when such companies were rare. They are now approximately as rare as pigeons, though the term has not been updated.
Example: In 2013, there were 39 unicorn startups. By 2024, there were over 1,200. At this rate of hornification, all horses will be unicorns by 2031.
Unicorn Symbolism
The practice of assigning abstract meaning to a creature that does not exist, which is also the primary function of a branding agency. Across cultures, the unicorn has symbolized purity, grace, power, and an inexplicable willingness to appear on the merchandise of companies that exhibit none of these qualities.
Example: A $47 billion company whose logo is a unicorn and whose product is adware.
Unicorn-Industrial Complex
The self-reinforcing ecosystem of venture capitalists, founders, tech journalists, and conference organizers who collectively maintain the economic viability of companies that have never produced a profit, through a process of mutual citation that resembles scholarship but produces funding rounds instead of peer-reviewed papers.
Example: The VC funds the startup. The startup buys ads in the tech publication. The tech publication writes about the startup. The VC reads the article and funds the next round. At no point does a customer appear.
Upskilling
The process of learning additional competencies within one's current field, distinguished from reskilling by the comforting implication that one's current field will continue to exist. Upskilling is the gentler cousin of reskilling: it says "you'll need to learn some new things" rather than "everything you know is worthless." Both may ultimately be true. Upskilling just delivers the news more politely.
Example: The accountant upskilled by learning to use AI for tax preparation. The accountant's value now depends on knowing what the AI gets wrong, which requires knowing everything the AI knows plus knowing what "wrong" means. This is called "upskilling."
Vaporware
A product that has been announced, marketed, and in some cases presold, but which does not exist in functional form. Vaporware is distinguishable from an early-stage product by the absence of any credible timeline for delivery and the presence of a highly detailed product page. The term was coined in 1982 and has remained continuously relevant ever since, which is itself a remarkable achievement for something defined by non-existence.
Example: The product had a website, a waitlist of 2.3 million users, a CEO who gave a keynote at CES, and no code repository.
Vaporware Taxonomy
A classification system for products that do not exist, organized by the sophistication of their non-existence. Class I vaporware has a press release. Class II has a press release and a waitlist. Class III has a press release, a waitlist, and a demo video. Class IV has all of the above plus a $200 million valuation. Class V has been acquired by a larger company and still does not exist, but the acquirer issues quarterly "integration updates."
Example: The product progressed from Class I to Class IV vaporware in eighteen months without writing a single line of code, which is either fraud or disruption depending on the jurisdiction.
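The five-class scheme above is rigid enough to mechanize. The following is a hypothetical sketch, not an industry standard; the attribute names are invented for illustration, and the class boundaries are taken from the definition above.

```python
# Hypothetical classifier for the glossary's five-class vaporware taxonomy.
# Attribute names are invented; class definitions follow the entry above.

def vaporware_class(press_release=False, waitlist=False, demo_video=False,
                    valuation_200m=False, acquired=False):
    """Return the highest vaporware class the evidence supports,
    or 0 if the product may regrettably exist."""
    if acquired and valuation_200m and demo_video and waitlist and press_release:
        return 5  # acquired, still nonexistent, quarterly "integration updates"
    if valuation_200m and demo_video and waitlist and press_release:
        return 4
    if demo_video and waitlist and press_release:
        return 3
    if waitlist and press_release:
        return 2
    if press_release:
        return 1
    return 0

# The product from the example above: Class I to Class IV in eighteen
# months, no code repository.
launch = vaporware_class(press_release=True)
exit_round = vaporware_class(press_release=True, waitlist=True,
                             demo_video=True, valuation_200m=True)
```

Note that the classes are cumulative by construction: each tier requires all the evidence of the tier below it, plus one further artifact of non-existence.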
Venn Diagram Analysis
A visual method for comparing overlapping categories using intersecting circles, applied in this textbook to such urgent analytical questions as "what do unicorns and billion-dollar startups have in common?" (both are rare, beautiful, and possibly imaginary) and "what do dragons and disruptive technologies have in common?" (both destroy villages and are subsequently described as "exciting" by observers who did not live in the village).
Example: The Venn diagram of "things that exist" and "things that have received venture capital funding" produces a surprisingly small overlap, which is the kind of observation that gets you disinvited from demo day.
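The analysis in the example reduces to ordinary set intersection. A minimal sketch follows; the membership lists are invented for the joke and are not data.

```python
# Venn-style overlap analysis as set operations. Membership lists are
# illustrative inventions, not market research.
things_that_exist = {"email", "spreadsheets", "the Trough of Disillusionment"}
things_with_vc_funding = {"spreadsheets", "vaporware", "the metaverse"}

overlap = things_that_exist & things_with_vc_funding       # intersection
only_funded = things_with_vc_funding - things_that_exist   # funded, not real

# The overlap is, as the entry warns, surprisingly small.
print(sorted(overlap))      # ['spreadsheets']
print(sorted(only_funded))  # ['the metaverse', 'vaporware']
```

The same two operators cover both Venn questions posed in the definition; the third circle, "things described as exciting by observers who did not live in the village," is left to the reader.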
Venture Capital
A financial arrangement in which wealthy individuals provide money to founders in exchange for equity, board seats, and the right to say "I was an early investor in [company]" at dinner parties. Approximately 90% of venture-backed companies fail, a success rate that would be considered catastrophic in any field except one where the remaining 10% occasionally produce returns of 100x, which makes the whole enterprise feel less like investing and more like extremely well-dressed gambling.
Example: "We don't invest in companies. We invest in people." — A venture capitalist who will replace those people with a more experienced CEO within eighteen months.
Visual Storytelling
The practice of conveying narrative primarily through images, a technique as old as cave paintings and as current as the infographic your marketing team made instead of fixing the product. Visual storytelling allows complex information to be communicated quickly and intuitively, which is why it is used to explain quarterly losses and avoided when explaining executive compensation.
Example: A pie chart showing budget allocation is visual storytelling. A pie chart showing that 87% of the "AI R&D" budget was spent on the office snack program is more effective visual storytelling.
Workforce Disruption
Large-scale changes to employment patterns caused by technological, economic, or social shifts, distinguished from normal labor market fluctuation by the speed at which it renders career advice obsolete. Current workforce disruption is occurring faster than the educational system can respond, faster than the policy system can regulate, and approximately as fast as a LinkedIn influencer can post about it.
Example: The career counselor recommended learning to code. The student learned to code. AI learned to code. The career counselor recommended learning to manage AI. This advice has a shelf life of approximately 18 months.
Worksheet Obsolescence
The impending irrelevance of traditional fill-in-the-blank, multiple-choice, and short-answer worksheets in an era when AI can complete any worksheet in under four seconds, a development that threatens approximately 73.2% of homework assignments issued in American K-12 education. Worksheet obsolescence has been predicted since 2023 and denied by the same educators who still use overhead projectors.
Example: The teacher assigned a five-paragraph essay. Every student used AI to write it. The teacher used AI to grade it. The educational value of this exchange approaches zero from both directions.
World Building
The process of constructing an imaginary setting with internally consistent rules, geography, history, and culture. In this textbook, the world is one in which mythical beasts are real, AI hype is acknowledged, and unicorns hold tenure. The most important rule of world building is internal consistency: once you establish that dragons conduct layoffs, dragons must always conduct layoffs. Consistency is what separates world building from lying.
Example: In this textbook's world, unicorns work in venture capital, dragons work in disruption, and ostriches work in educational administration. This mapping is fictional. The behaviors it describes are not.