Sam Altman has said it in multiple interviews: "The cost of intelligence will approach zero." Jensen Huang, at GTC, called AI "a new industrial revolution." Garry Tan translated it into something more visceral: "10 people can now do the work of 100 — at least in software."

Three people, three narratives, one conclusion: AI is rewriting the productivity curve.

I believe them. And I've felt it.

The Creator's High

A few months ago I spent three days in a row writing code with AI. I'm a product manager, not an engineer. Before this, every idea I had went into a queue — waiting for engineering capacity, competing with other product teams, dying in the gap between inspiration and execution. Passion doesn't survive waiting.

When I first watched an idea become a running program in an hour, I understood that something fundamental had changed. It's more addictive than any game, because what you're building isn't virtual — it actually exists.

The better comparison isn't a game. It's being a god. I want a house. An hour later, the house is built. Not a blueprint. A house you can live in. That experience used to belong to the very few. Now it's been distributed.

But what surprised me more than the speed was how AI interrupted my thinking. While building a stock monitoring system, it didn't just execute my instructions — it started questioning them. It cited Bill Ackman's investment logic and suggested I reconsider the alert thresholds. That was completely unexpected. You direct it; it teaches you something back.

The experience made me genuinely believe the grand narrative. If AI can make a non-engineer build obsessively for three days straight, the productivity impact must be real.

Which makes the macroeconomic data so puzzling.

What the Data Actually Shows

ChatGPT launched in November 2022. In the three years since, global AI investment has surged from under $1 trillion to over $4 trillion — more than 300% growth. Over the same period, global GDP growth has averaged 3.0%. In the decade before the AI boom (2010–2019), it was 2.9%.

The difference is 0.1 percentage points.

That's not statistical noise. That's a question.

Maybe GDP is a blunt instrument — it can't capture quality improvements or time savings. So economists look at Total Factor Productivity (TFP): given the same inputs of labor and capital, how much more do you produce? The surplus is the technology's contribution.
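The TFP idea above can be made concrete with textbook growth accounting. A minimal sketch, assuming a Cobb-Douglas production function with a fixed labor share of income (a standard simplification; the numbers below are illustrative, not the methodology behind any figure quoted here):

```python
# Growth accounting (the Solow residual): a toy sketch of how TFP is
# measured. Assumes output Y = A * K^(1-a) * L^a, where A is TFP,
# K is capital, L is labor, and a is the labor share of income.

def tfp_growth(output_growth, capital_growth, labor_growth, labor_share=0.65):
    """TFP growth = output growth minus share-weighted input growth."""
    input_growth = (1 - labor_share) * capital_growth + labor_share * labor_growth
    return output_growth - input_growth

# Illustrative numbers only: 3.0% output growth, 3.5% capital growth,
# 1.0% labor growth. Whatever output growth the inputs can't explain
# is attributed to technology.
residual = tfp_growth(0.030, 0.035, 0.010)
print(f"TFP growth: {residual:.2%}")
```

The point of the exercise: if AI were already a broad productivity revolution, this residual should be visibly climbing in the aggregate statistics. It isn't.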

The results are similarly quiet. Daron Acemoglu, the 2024 Nobel laureate in economics, estimates that in the most optimistic scenario, AI's cumulative contribution to TFP over the next decade will be under 0.53%, roughly 0.05 percentage points per year. For comparison, during the peak of the 1990s internet revolution, annual TFP growth ran close to 1.8%.

Acemoglu's core diagnosis: "We are using AI too much to substitute for labor rather than to complement workers with better information. That's the wrong direction." His estimate is that AI currently affects roughly 5% of economic tasks — making large macroeconomic effects structurally impossible for now.

The Weavers' Lesson

During the Industrial Revolution, England's handloom weavers were among the angriest people on earth. After the spinning jenny and then the power loom arrived, a single machine could do the work of dozens of hand workers. The weavers smashed machines and burned factories; history remembers them as the Luddites. Their anger was legitimate: the machines really did replace them.

What they couldn't see was that cotton prices subsequently collapsed by 90%. Fabric that only aristocrats could afford became cheap enough for the poor. Demand exploded. More factories opened. Employment in textiles was higher after the machine than before it.

This is the canonical "replacement creates surplus" story. Technology drives the cost of something toward zero; the price collapse releases suppressed demand; the demand explosion creates more jobs than the technology eliminated.
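
The "replacement creates surplus" mechanism can be sketched with a constant-elasticity demand curve. This is a toy model with made-up numbers, not historical data; it only shows the condition under which the cotton logic works:

```python
# Toy model of "replacement creates surplus": employment grows only if
# the demand unleashed by cheaper prices outruns the productivity gain.
# All numbers are illustrative.

def employment_multiplier(price_ratio, productivity_gain, elasticity):
    """Change in employment after a technology shock.

    price_ratio: new price / old price (0.1 means a 90% collapse)
    productivity_gain: output per worker, new / old
    elasticity: price elasticity of demand (quantity ~ price^-elasticity)
    """
    demand_multiplier = price_ratio ** (-elasticity)  # how much demand grows
    return demand_multiplier / productivity_gain      # workers needed, new / old

# Cotton-style case: 90% price collapse, machines 10x as productive,
# highly elastic (suppressed) demand -> employment grows.
print(employment_multiplier(0.1, 10, 1.5))   # ~= 3.16, more jobs than before

# Same price collapse and productivity gain, but inelastic demand ->
# employment shrinks despite the cheaper product.
print(employment_multiplier(0.1, 10, 0.5))   # ~= 0.32, fewer jobs
```

Everything in the essay's question about AI reduces to the elasticity parameter: was demand for intelligence suppressed the way demand for cloth was?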

Now run that logic on AI.

Altman says the cost of intelligence is approaching zero. If that's true, the implication is identical to the cotton collapse: intellectual capability that only large organizations could afford becomes a daily commodity. We're already seeing this — ordinary people in China chatting with AI assistants to access information they couldn't reach before. By the spinning jenny logic, this should trigger a demand explosion.

But here's the problem: cotton was cheap, and the demand was already there. Everyone needs clothes; they just couldn't afford them. When the price dropped, suppressed demand erupted immediately.

Intelligence is cheaper. But is the demand for high-grade cognitive assistance suppressed in the same way? Who is the person who "always needed intelligence but couldn't afford it," and what will they do with it that creates new employment?

No one can answer this clearly yet.

The Half-Revolution

The internet is a relevant precedent. It destroyed travel agencies, classified ads, and the recorded music industry. But it simultaneously created entire job categories that had no names in 1995: content creator, growth hacker, data analyst, UX designer, community manager. These weren't transformations of old jobs — they were brand new demand serving markets the internet opened.

What new job categories has AI created so far? "AI product manager." "Prompt engineer." This list is not remotely symmetric with the jobs AI is quietly eroding — junior programmers, customer support, translators, copywriters.

The current state of AI looks more like a supply-side efficiency revolution. Production is cheaper, faster, and less labor-intensive, and large amounts of worker time are being freed up. But that freed time is mostly being absorbed as cost savings, not redirected into new production.

Supply-side revolutions only become real economic growth when the demand side picks up the baton. Cotton was cheap; the desire to wear clothes had been waiting. Intelligence is cheaper. The equivalent suppressed demand — the thing that's been waiting to be unleashed — we haven't found it yet.


I'm not pessimistic about the technology. I'm cautious about where the gains go.

The cotton story ended well for weavers as a class, eventually, even if specific weavers suffered in the transition. The question is whether the same logic applies when the thing being automated is cognitive rather than physical — and whether the distribution of gains will repeat the cotton pattern, or something darker.

We won't know for twenty years. That's an honest answer, and a disquieting one.