Welcome to the hundreds of new subscribers who have joined over the past few weeks. In today’s free post, I look at who wins in AI. It’s not venture capitalists.
If you like what you read here and you're not yet subscribed, consider subscribing. Most of my posts are free, though deeper dives tend to be paid.
In 2011, the venture capitalist Marc Andreessen declared that "software is eating the world." It was a prescient take. Over the next decade, software abstracted and consumed logistics, finance, entertainment, retail, and labor. Every major business that mattered (Uber, Stripe, Netflix, Amazon) was, at root, a software firm stapled to a real-world workflow. Margins shifted to whoever owned the code.
But something has changed.
In AI, models eat software. And with that inversion, the physics of value creation shift again. The scarce resource is no longer clever code. It’s capital, compute, data, energy, and, above all, strategic positioning inside the AI-industrial stack.
Let’s break this down.
1. From Software Abstraction to Model Annexation
Today, thousands of startups are building software on top of large language models: wrappers, agents, orchestration layers. But most of these are structurally fragile. Their value rests on thin advantages: clever prompting, minor UX polish, slight vertical adaptation.
And the foundation model labs, including OpenAI, Anthropic, and DeepMind, have every incentive to consume them.
Once a wrapper demonstrates product-market fit and telemetry flows, the lab can distill its functionality and absorb it into the base model or runtime. The lab already runs the inference layer. Adding a new capability, such as tool use, file handling, or planning heuristics, incurs negligible marginal cost. For a third-party app, the same feature requires SaaS overhead, marketing, compliance, and constant differentiation.
The result: annexation. Wrapper startups become telemetry for the model’s next release.
This is not software eating the world. This is software being reabsorbed into the substrate that it was built on.
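The marginal-cost asymmetry behind annexation can be made concrete with a toy comparison. Every dollar figure below is invented purely for illustration; only the shape of the comparison, a lab folding a feature into infrastructure it already runs versus a wrapper carrying full standalone overhead, comes from the argument above.

```python
# Back-of-envelope sketch of the annexation argument.
# All figures are hypothetical placeholders, not estimates.

def annual_cost_to_ship_feature(engineering, marketing, compliance, infra_margin):
    """Total annual cost for one party to offer one model-adjacent feature."""
    return engineering + marketing + compliance + infra_margin

# A frontier lab distills the feature into the base model/runtime it
# already operates: no separate go-to-market, no separate compliance
# surface, no margin paid to an upstream model provider.
lab = annual_cost_to_ship_feature(
    engineering=500_000,
    marketing=0,          # ships inside the existing product surface
    compliance=0,         # covered by the lab's existing program
    infra_margin=0,       # the lab owns the inference layer
)

# A wrapper startup carries full SaaS overhead on top of API costs.
wrapper = annual_cost_to_ship_feature(
    engineering=500_000,
    marketing=1_500_000,  # customer acquisition for a standalone product
    compliance=400_000,   # its own audits, legal, and support burden
    infra_margin=600_000, # markup paid to the lab for inference
)

print(f"lab: ${lab:,}  wrapper: ${wrapper:,}  ratio: {wrapper / lab:.1f}x")
```

Change any of the placeholder numbers and the conclusion survives: the wrapper pays three cost lines the lab never sees, so the same feature is structurally cheaper for the lab to absorb than for the startup to defend.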
2. Model Economics Are Not Software Economics
The old thesis relied on software’s core economics:
Infinite replicability
Marginal cost ≈ 0
Defensibility via network effects
Winner-take-all via distribution
But modern frontier models break that mold.
Training a GPT-4–class system requires:
Hundreds of millions of dollars in compute
Restricted access to cutting-edge silicon
Gigawatts of power and advanced cooling
Proprietary or synthetic data loops
Legal insulation against misuse and hallucination
These are not traits of "software." They are industrial supply chain properties.
And once inference becomes cheap and ubiquitous, the economic bottlenecks migrate upstream: to power, hardware, and compliance. That’s where the leverage goes.
3. Software Doesn’t Eat Power Plants
“Software eats the world” worked because software was light. It floated on pre-existing infrastructure: cloud compute, broadband, smartphones. Its physics were virtual.
But foundation models do not ride infrastructure. They are infrastructure. And their performance depends directly on:
Power generation and delivery
High-performance real estate
Foundry access and HBM supply
Geographic policy, data localization, and export control
You cannot build GPT-5 in a garage. In most countries, you cannot even host GPT-4-class inference.
In this world, the winning position isn’t writing a better app. It’s owning the land next to the substation, the lease on the cooling system, the offtake agreement for the power, the partnership with the regulator.
AI is not software. AI is industrial cognition. And it is re-industrializing the world.
4. Commoditization Is Real. But Not Where It Matters
Yes, base models will commoditize at the weights level. Once a model reaches a capability threshold and the weights leak or get open-sourced, the underlying artifact becomes replicable.
But the weights are not the product. The system is the product.
What matters is how models are:
Integrated into enterprise data flows
Grounded via retrieval
Audited for compliance
Co-located with compute and latency guarantees
Continuously fine-tuned via user telemetry
Inference alone is cheap. But the complete, dependable system, with SLAs, indemnity, latency constraints, audit trails, and integration, is not.
So yes, the weights may be free. But the control points aren't.
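A toy ledger illustrates "the weights may be free, but the control points aren't." The line items mirror the list above; every number is a made-up placeholder chosen only to show that raw inference can become a small fraction of what an enterprise actually pays for.

```python
# Toy illustration: raw inference vs. the complete, dependable system.
# All monthly figures are hypothetical placeholders, not estimates.

raw_inference_per_month = 10_000  # commodity token costs on open weights

enterprise_system_per_month = {
    "raw inference": raw_inference_per_month,
    "retrieval + enterprise data integration": 25_000,
    "compliance audits + indemnity": 20_000,
    "co-located compute with latency SLAs": 40_000,
    "continuous fine-tuning on telemetry": 15_000,
}

total = sum(enterprise_system_per_month.values())
share_of_inference = raw_inference_per_month / total

print(f"total system cost: ${total:,}/mo")
print(f"raw inference share: {share_of_inference:.0%}")
```

Under these assumed numbers, inference is under a tenth of the bill; even if open weights drove that line to zero, most of the spend, and therefore most of the pricing power, stays with whoever owns the other four lines.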
5. The End of Application Arbitrage
In the previous paradigm, value was captured by arbitraging between legacy sectors and agile software wrappers. Uber didn’t own cars. Airbnb didn’t own hotels. Value came from sitting in the middle, adding coordination and code.
In the AI era, that logic flips.
Now, value comes from owning the whole stack, or from deeply embedding into the domain: medical records, legal workflows, hardware control loops, real-time sensor networks, edge inference.
This is no longer a game of modular arbitrage. It’s a game of vertical entanglement.
Apps that live purely in the middle layer, abstracted from power, latency, compliance, and proprietary data, will be eaten. Not by competition. By the models themselves.
6. Who Wins in a World Where Models Eat Software?
Frontier labs win by releasing primitives and absorbing generic functionality.
Infra owners win by capturing value in power, land, silicon, and latency.
Sovereigns win by shaping policy, certification, and deployment rules.
Specialists win by embedding deeply in domain-specific, compliance-heavy verticals.
Startups survive only if they own proprietary data, unique distribution, or real-world integration.
The future isn’t one where the best UX wins. It’s one where the deepest integration with atoms, data, and law wins.
7. The Venture Model Was Built for a Different Game
If models eat software, and software is no longer the point of leverage, then we also have to ask: What happens to the capital stack that was built to back software?
The short answer: it’s misaligned.
Traditional venture capital, especially the kind that flourished in the SaaS/cloud era, was designed for a world of:
Low capex, short cycle times
Modular composability
Distribution-driven moats
Lightweight compliance
High-velocity iteration
That model doesn’t translate cleanly into a world where:
Training cycles take years and billions
Power and silicon access determine viability
Sovereign relationships affect deployment
Compliance gates determine enterprise viability
Data control, not feature velocity, creates defensibility
This is not a failure of venture capital. It’s a failure of fit.
VCs were optimized for rapid abstraction, not slow industrial layering. For software that floats, not cognition that anchors into the grid.
The terrain changed. The map must change too.
Closing Thought
Software ate the world because it was light.
Models eat software because they are heavy.
And what eats models?
Power. Land. Policy. And time.
The firms that succeed in the next decade will not be those that iterate the fastest, but those that lock in the deepest, most durable position in the substrate.
If software was the abstraction layer of the last industrial revolution, AI is the substrate of the next one.
And substrate isn’t something you rent.
It’s something you own.
Coda
If you enjoy this newsletter, consider sharing it with a colleague.
Most posts are public. Some are paywalled.
I'm always happy to receive comments, questions, and pushback.