The Grid is the Mind: The Missing Foundation of the AGI Dream
You can't have AGI without a robust power grid
This post is an extension of, and complement to, my earlier post about adequate power being the principal bottleneck to AGI.
The human brain, in all its organic elegance, runs on twenty watts—less than a dim bedside lamp. It is silent, self-contained, and learns from scraps. It models the world in milliseconds without demanding a single kilowatt-hour more than absolutely necessary.
This fact should haunt anyone who believes that artificial general intelligence is just around the corner.
Because unlike human cognition, which emerged through the brutal compression algorithm of natural selection, artificial intelligence is not lean. It does not emerge from scarcity. It gorges itself on compute, inhales gigawatts like oxygen, and demands not just algorithmic breakthroughs, but vast amounts of physical infrastructure.
And no one, least of all the techno-optimists forecasting AGI by 2027, is acting like that matters.
Take the widely circulated AI 2027 scenario. It forecasts, with unnerving confidence, that by the end of 2027 we will have agentic, self-improving AI systems conducting most of their own R&D, doubling their own capabilities every few weeks, and accelerating toward superintelligence. It lays out recursive training loops, geopolitical flashpoints, synthetic data pipelines, and government-aligned megaprojects. It speculates that OpenAI (fictionalized as OpenBrain) and its competitors will operate clusters of 10 million GPUs or more, running continuously, updating daily. All of this is presented as an inevitability.
And yet, a key assumption hiding in plain sight under all this recursive elegance is that the physical world will simply comply. That gigawatt-scale compute will be delivered to these clusters like hot water from a faucet. That power, water, cooling, and permitting will somehow sync up with quarterly product cycles and exponential compute curves.
What this reveals is something deeper than logistical oversight. It’s an epistemological divide. AI accelerationists are trained to see the world as an optimization problem. A loss function to be minimized. A series of gradients to be ascended. In this frame, infrastructure is just another lever. A trivial obstacle. A constant you can wrap into your model. Need 10X the compute? Set up the training run. Need 100X? Add more GPUs. The rest will take care of itself.
But reality is not differentiable. You can’t backpropagate through a zoning board.
What AI 2027 barely mentions are transformers. Not the neural networks, but the actual physical transformers that step voltage up and down. Or the substations. Or the environmental reviews. Or the liquid cooling systems. Or the water rights.
The report casually predicts datacenter clusters with 10GW of draw, equivalent to New York City’s peak summer demand. But it does so without exploring what it takes to get that power. It imagines a fleet of global AGI clusters being brought online with the ease of deploying a new software stack.
But infrastructure doesn’t scale like software. It moves slowly and politically. You don’t get a 10GW buildout with a memo. You get it with years of permitting, land fights, lawsuits, transformer shortages, transmission bottlenecks, and grid interconnection studies.
This is the foundational blind spot of modern AI discourse. In much of Silicon Valley, there is a near-mystical belief that if a problem can be encoded as a computational task, it can be solved. But power infrastructure isn’t a computational problem. It’s an engineering problem embedded in regulatory constraints, social resistance, and physical scarcity. It’s not just about scaling laws. It’s about trenching cables through land owned by thirty different stakeholders.
The authors of AI 2027 may have predicted algorithmic recursion, but they forgot thermodynamic recursion. Every step in this imagined self-improving loop requires exponentially more power, cooling, and coordination. And that means exponentially more infrastructure. The assumption that we can scale AGI without scaling the grid is not just wrong. It’s physics denial.
Let’s run the math. An H100 GPU draws 700 watts. A cluster of 10 million? That’s 7 gigawatts. Add cooling, networking, redundancy: easily 10GW. That’s one cluster. The scenario posits multiple such clusters, across competing labs, governments, and militaries.
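For the skeptical, here is that arithmetic as a runnable sketch in Python. The 700 W figure is the H100's rated TDP; the 1.4x overhead multiplier for cooling, networking, and redundancy is an illustrative assumption of mine, not a number from the scenario:

```python
# Back-of-the-envelope power draw for a 10-million-GPU cluster.
GPU_WATTS = 700          # H100 SXM rated TDP
NUM_GPUS = 10_000_000    # cluster size from the AI 2027 scenario
OVERHEAD = 1.4           # assumed multiplier for cooling, networking, redundancy

it_load_gw = GPU_WATTS * NUM_GPUS / 1e9   # 700 W x 10M = 7.0 GW of raw GPU draw
facility_gw = it_load_gw * OVERHEAD       # ~9.8 GW with overhead included

print(f"GPU draw: {it_load_gw:.1f} GW; with overhead: {facility_gw:.1f} GW")
```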
This isn’t just an AI scaling question. It’s a civilizational scaling question. Because power isn’t just a budget line. It’s location, continuity, reliability, redundancy, security, and supply chains. Power doesn’t run on PowerPoint. It runs on copper, steel, cement, and water. And all of those are bottlenecked.
Consider cooling. The datacenters of the AGI era won’t just be hot. They’ll be industrial-facility-hot. They’ll need liquid cooling loops, evaporative systems, and in many cases, access to municipal-scale water supplies. Water rights are already straining in states like Arizona and Utah. Good luck adding AGI megaclusters to that equation.
In the future imagined by AI 2027, multiple megaclusters would be constantly retraining themselves with new data, generating their own synthetic corpora, refining their own parameters, and pushing new versions in real time. But where will the power come from? Where will the water come from? Who will build the cooling towers? Will they be built in time? What happens when a heat dome hits the Southwest and the grid buckles under peak demand?
Or take transmission. In the United States, new high-voltage lines routinely take 7 to 12 years to complete. Small modular reactors (SMRs) might help, but they don’t exist yet in commercial form. Gas turbines can be installed faster, but they’re still subject to permitting fights and local opposition. Nuclear? Functionally dead on arrival. And that’s before you consider NIMBYism, environmental lawsuits, or the lack of skilled labor to build all this stuff.
So when AI 2027 predicts that Agent-2 or Agent-3 will require vast retraining cycles on synthetic data daily, or that AI R&D will happen in a recursive loop of self-optimization, it’s imagining a future that assumes away every physical constraint. It presumes uninterrupted exponential growth in compute, with no serious discussion of delivery.
But power delivery is the bottleneck. It’s neo-industrial. The sleek AGI labs of tomorrow will resemble Cold War missile silos more than startups. Hardened substations. Private power generation. Water rights. Cooling farms. Physical security. Even if the AI is “aligned,” the infrastructure will need to be militarized.
We should expect hardened facilities, co-located power plants, and fortified substations behind razor wire. They won't look like campuses. They'll look like launch silos. A future in which AGI runs on on-site natural gas turbines or small nuclear reactors is not just plausible. It's inevitable if the models need that much juice.
AGI, if it comes, won't run in the cloud. It will run in concrete. And unless you're prepared to solve for grid hardening, buildout timelines, and utility-grade reliability, you're not forecasting. You're hallucinating.
The final irony is this: biological intelligence emerged under extreme scarcity. It is sparse, efficient, and metabolically cheap. The AGI we dream of is obese by comparison. It is a mind that exists only through endless provisioning. Minds that do not sleep must be fed forever.
Nature teaches parsimony. But AGI, as currently envisioned, demands gluttony. It is not a mind built in a cave with scraps. It is a mind built in a furnace with pipelines. And the furnace has to stay on.
So yes, interpretability, alignment, safety: these all matter.
But if you are serious about AGI, you need to be serious about energy, cooling, and steel. You need to be serious about permitting law, about transformer shortages, about the price of copper per ton and the waiting list for SMR approvals.
Because the mind—this alien mind we’re racing to build—isn’t just code.
It’s infrastructure.
It’s wattage.
It’s water.
It’s copper.
It’s the grid.
And until that grid is built, the dream is just a fever.
>>> Let’s run the math. An H100 GPU draws 700 watts. A cluster of 10 million? That’s 7 gigawatts. Add cooling, networking, redundancy: easily 10GW. That’s one cluster. The scenario posits multiple such clusters, across competing labs, governments, and militaries.
Is this a bad prediction though?
What does the US Energy Information Administration (EIA) outlook prediction say?
https://www.eia.gov/outlooks/aeo/data/browser/#/?id=9-AEO2025&cases=ref2025&sourcekey=0
Total Electric Power Sector Capacity (GW):
2024: 1194.4
2027: 1299.6
That's growth of 105 GW! Certainly seems like 1 or 2 10GW clusters can fit here, especially if most of the growth is occurring due to datacenters (see below).
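Here's that sanity check as a quick sketch, using the post's own 10GW-per-cluster estimate:

```python
# EIA AEO2025 reference case: total electric power sector capacity (GW).
cap_2024 = 1194.4
cap_2027 = 1299.6

growth_gw = cap_2027 - cap_2024   # ~105.2 GW of projected new capacity
clusters = growth_gw / 10         # 10 GW per cluster, per the post's estimate

print(f"Projected growth: {growth_gw:.1f} GW, room for ~{clusters:.1f} 10GW clusters")
```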
Or how about International Energy Agency (IEA) predictions?
https://www.iea.org/reports/electricity-2025
"We expect US electricity demand to grow at an average annual rate of 2% over the 2025-2027 period."
That's not far off from EIA predictions.
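Capacity and demand are different metrics, so this is only a ballpark comparison, but a rough cross-check of the implied annual rates puts them in the same neighborhood:

```python
# Implied compound annual growth of EIA capacity (2024 -> 2027)
# vs. the IEA's ~2%/yr demand-growth figure.
cap_2024, cap_2027 = 1194.4, 1299.6
annual = (cap_2027 / cap_2024) ** (1 / 3) - 1   # ~2.9%/yr

print(f"Implied EIA capacity growth: {annual:.1%}/yr vs IEA demand growth ~2%/yr")
```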
Further, they note (page 36 of the PDF):
"The rapid expansion of data centres in the United States has positioned the sector as a major catalyst of electricity demand growth, which will have a substantial impact on the country’s energy landscape. A recent study commissioned by the U.S. Department of Energy highlights the strong growth in electricity consumption by data centres, which rose from about 60 TWh in 2014 to 176 TWh in 2023, and constituted more than 4% of the country's total electricity use. Their scenarios indicate this consumption could rise by an additional 150 TWh [17 GW] to 400 TWh [45 GW] by 2028, reaching about 325 TWh [37 GW] to 580 TWh [66 GW], and accounting for 6.7% to 12% of total US electricity demand. The growth trend is supported by the record pace of new data centre announcements. In the first half of 2024 alone, announced data centre projects were associated with nearly 24 GW of power capacity needs – triple the amount stated during the same period in 2023."
Catch that? Rise by 17-45 GW by 2028 due to datacenters alone!
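The bracketed GW figures in the quote are just average-power conversions of annual energy, assuming 8,760 hours in a year:

```python
# Converting annual energy (TWh/yr) into average power (GW):
# 1 TWh/yr = 1000 GWh / 8760 h ~= 0.114 GW.
HOURS_PER_YEAR = 8760

def twh_to_avg_gw(twh: float) -> float:
    return twh * 1000 / HOURS_PER_YEAR

for twh in (150, 400, 325, 580):
    print(f"{twh} TWh/yr ~= {twh_to_avg_gw(twh):.1f} GW average")
```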
The report continues:
"States where computing facilities are expanding rapidly are showing higher electricity demand growth rates above the national average. According to the Energy Information Administration (EIA), commercial electricity demand has grown the fastest in states hosting clusters of computing facilities. Between 2019 and 2023, the ten states with the most rapid demand growth added 42 TWh in total – a 10% increase over the four-year period. Electricity demand has increased the most in Virginia, which added 14 TWh, followed closely behind by Texas with
about 13 TWh."
"Northern Virginia has emerged as a major hub for data centres, with 94 new facilities and over 4 GW of capacity connected since 2019. Virginia is by far the state with the highest share of electricity demand coming from data centres at more than 25%. According to a report by a consulting firm, an anticipated 11 GW of additional data centres in Northern Virginia by 2030 would account for more than 40% of the state's current peak electricity demand."
If a single state could conceivably support AI 2027's 10GW prediction for the top AI company, I wouldn't call that prediction a "blind spot".