Value accrues at the bottom of the AI stack
LLM providers build atop vast amounts of infrastructure, and value flows down the stack
The current AI boom has created a gold rush of wrappers, plugins, and prompt-powered SaaS apps. Venture capitalists, predictably, have piled into the upper layers of the stack, funding companies that build lightweight services on top of large language models. It’s a comfortable place for them: fast to build, easy to pitch, and, in theory, scalable. But this enthusiasm masks a deeper structural inversion. The real value is accumulating at the bottom of the stack, not the top.
At the very base lies the physical infrastructure required to train and serve frontier models. This includes data centers, power contracts, GPUs, networking, cooling systems, and land: an entire world of logistics and capital-intensive coordination that looks more like oil and gas than tech. It is slow, dirty, unglamorous, and brutally hard to replicate. That is precisely why it’s becoming the choke point for the entire AI economy.
Above that foundational layer sit the large models themselves: GPTx, Claude, Gemini, LLaMA. These models are not software products in the traditional sense. They are trained assets, amortized over billions in capex and thousands of hours of alignment work. They are the middle layer of the AI stack but increasingly the command layer, because whoever owns the model owns the interface, the protocols, and, eventually, the customer relationship.
And finally, atop all of this, sits the application layer: the wrappers, copilots, agents, and prompt-chains that VCs love to fund. But most of these are terminally fragile. They rely on APIs controlled by the model layer and lack defensibility. Their core differentiator is often UX polish or domain-specific prompt tuning. This is useful, but commodifiable. There are exceptions, of course: companies that integrate deeply into vertical workflows, or that own proprietary datasets unavailable to the model providers, may have durable moats. But they are the exception, not the rule. Most are building on sand.
This structure is now collapsing in on itself, as gravity reasserts its pull. OpenAI’s Stargate project is a clear signal that the real game is infrastructure. You don’t need to squint to see what’s happening: OpenAI is moving down the stack, claiming control of the GPU clusters, power lines, and silicon allocations that it previously rented from Microsoft. Why? Because every upstream dependency is now a point of strategic vulnerability. Power contracts matter more than product features.
This reversal mirrors a historical pattern. In the early internet, everyone scrambled to build websites. But it was the companies that owned the distribution rails that captured the long-term economic rents. Think: Amazon with logistics, Google with indexing, Facebook with identity. In the cloud era, venture capitalists poured money into SaaS companies, but the real profits accrued to Amazon (AWS) and Microsoft (Azure). And today, in the AI era, the same pattern is repeating. While the attention economy rushes to mint another note-taking app with a chatbot bolted on, the real empires are being forged in substations and GPU fabs.
Venture capital, for all its mythos, is structurally misaligned with this new reality. It wants velocity, not infrastructure. It seeks power laws, not power grids. Its models are tuned to chase apps, not assets. This is not a moral failure, but a constraint of fund mechanics: LPs want exits within a decade, not amortized returns over 30 years. And so VCs crowd into the top of the stack, even as value creation sinks to the bottom.
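The fund-mechanics point can be made concrete with back-of-the-envelope arithmetic (the multiples and horizons below are hypothetical illustrations, not figures from the text): the same total return multiple implies a radically different annualized rate depending on whether it lands inside a fund's life or amortizes over decades.

```python
# Illustrative sketch with hypothetical numbers: why a 10-year fund
# structure favors fast app exits over slow infrastructure returns.

def cagr(multiple: float, years: float) -> float:
    """Compound annual growth rate implied by a total return multiple."""
    return multiple ** (1 / years) - 1

# The same 10x total return, compressed into an 8-year exit window
# versus amortized over a 30-year infrastructure asset.
venture = cagr(10, 8)          # roughly 33% per year
infrastructure = cagr(10, 30)  # roughly 8% per year

print(f"venture-style exit:   {venture:.1%} / yr")
print(f"infrastructure asset: {infrastructure:.1%} / yr")
```

An LP base that demands the first number within a decade cannot wait for the second, even when the absolute dollars at the bottom of the stack are larger.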
But here’s the twist: the stack is becoming indivisible. The next generation of AI giants will be full-stack companies. They will own the silicon, the model, and the interface. They will be vertically integrated in a way that hasn't been common in tech since the days of Bell Labs and IBM. The cloud unbundled computing; AI will re-bundle it. And the winners will not be the ones who built the best wrapper, but the ones who owned the ground it ran on.