When AI Flips the Switch on Texas
ERCOT’s latest risk models show power no longer follows weather or gas. It follows GPUs. Until the grid starts pushing back.
Welcome to the latest edition of Buy the Rumor; Sell the News. In today’s post, I take a look at how Texas’ electrical grid operator, ERCOT, manages expected demand loads, and how AI data centers affect those forecasts.
If you like what you read here, consider subscribing, and if you’re already subscribed, consider upgrading to a paid subscription. If you want to connect with me directly, my contact information is at the end of this post.
At first glance, ERCOT’s latest Monthly Outlook for Resource Adequacy (MORA) is the sort of document only a reliability engineer could love: pages of hourly probability distributions, deterministic scenario tables, and obscure footnotes on CAFOR¹ thresholds. But look closer, and it’s a fascinating mirror of our industrial priorities. It’s a probabilistic confession that the single biggest emerging risk to Texas’ power system is neither weather nor mechanical failure, but the erratic onboarding schedule of AI data centers.
Put plainly: whether Texans lose power this summer is increasingly decided not by heatwaves or wind deficits, but by when hyperscale compute operators decide to energize their next 1,000 megawatts of GPUs. That’s a striking and underappreciated inversion of who now sets the risk envelope for an entire state’s grid.
Why these ERCOT reports matter
Before diving into the AI angle, it’s worth grounding ourselves. ERCOT’s MORA report is not merely a bureaucratic ritual. It’s an attempt to forecast, using both deterministic scenarios and Monte Carlo-style probabilistic simulations, the chances that Texas will face a grid shortfall serious enough to require an Energy Emergency Alert (EEA), or worse, rolling blackouts.
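The real PRRM is far richer, but the core of a Monte Carlo adequacy model can be sketched in a few lines of Python. Every number below (the load means, the standard deviations, the reserve floor) is an illustrative placeholder, not a figure from the MORA:

```python
import random

random.seed(42)

def simulate_eea_probability(n_trials=100_000, reserve_floor_mw=3_000):
    """Toy Monte Carlo adequacy model (illustrative numbers only).

    Each trial draws a peak demand, dispatchable fleet output, wind
    output, and new large-load energization, then checks whether
    available reserves fall below the emergency threshold.
    """
    shortfall_count = 0
    for _ in range(n_trials):
        demand = random.gauss(74_000, 2_500)       # MW, evening peak load
        thermal = random.gauss(72_000, 1_500)      # MW, dispatchable fleet
        wind = max(random.gauss(14_000, 4_000), 0) # MW, 9PM wind (volatile)
        new_loads = max(random.gauss(537, 300), 0) # MW, data centers
        reserves = thermal + wind - (demand + new_loads)
        if reserves < reserve_floor_mw:
            shortfall_count += 1
    return shortfall_count / n_trials

prob = simulate_eea_probability()
print(f"Estimated EEA probability: {prob:.2%}")
```

The EEA probability is just the fraction of trials that land below the reserve floor. Scale the same idea up to hourly granularity and correlated weather draws, and you have the skeleton of ERCOT’s approach.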
These aren’t idle speculations. In February 2021, Texas suffered catastrophic outages during Winter Storm Uri, which ultimately killed hundreds. Since then, the public has paid closer attention to ERCOT’s reserve margins. And politicians have started to scrutinize any signs that reliability could again spiral. The MORA is ERCOT’s early-warning dashboard, telling market participants and policymakers how close the grid might come to the edge every month.
This summer’s surprise: blackout risks deflated by delayed AI data centers
ERCOT’s August 2025 MORA offers an unusually explicit illustration of how hyperscale compute is bending the entire grid’s risk posture. In the July outlook (prepared two months ago), ERCOT’s models anticipated that the grid would see roughly 1,727 MW of new large loads energized by the August evening peaks, with a long-tail scenario (99th percentile) soaring to 3,849 MW.
What are these “large loads”? Mostly, they’re data centers, many tied directly to AI training and inference. Each hyperscale site can draw hundreds of megawatts continuously, equivalent to the power needs of a small city. For perspective, 1,700 MW is roughly the average draw of 1.7 million Texas homes; on a sweltering afternoon with air conditioners blasting, each home pulls several kilowatts, so the peak-equivalent figure is still several hundred thousand homes.
But something happened on the way to August. These hyperscale loads were delayed, whether by construction slippage, hedging negotiations, or slower-than-expected GPU deployments. The new August report now forecasts only 537 MW of new large loads. The extreme tail scenario also collapsed to that same 537 MW, down from nearly 4,000 MW.
These AI data center delays dropped ERCOT’s EEA risk to just 0.48%. For ERCOT, anything below 10% is considered comfortably low.
In other words, the primary reason Texas will not sweat through high blackout probabilities this August is because large AI compute facilities quietly deferred energizing their loads by a few months. That’s not a weather story. It’s an AI industrial schedule story.
Why renewable energy doesn’t bail us out at 9PM
Surely renewables will smooth out this volatility, right? Before dismissing this as a blip that wind and solar will absorb, it’s worth looking more carefully at ERCOT’s power generation stack.
Texas now boasts over 50 GW of installed wind and solar, vastly outstripping its 13 GW of coal and 5 GW of nuclear. But installed capacity is a marketing figure; actual delivered power depends on the hour and weather. At 9PM, solar is effectively zero. And while ERCOT’s deterministic scenario for the critical evening hour does expect around 14 GW of wind, that’s still heavily probabilistic. The model’s low-wind sensitivity runs thousands of scenarios that show how quickly EEA risks spike if Panhandle and Gulf Coast winds underperform.
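The shape of that sensitivity is easy to see with a toy sweep. The capacity and demand figures here are illustrative placeholders, not ERCOT’s actual scenario inputs:

```python
# Illustrative 9PM reserve-margin sweep as wind underperforms.
# All MW figures are placeholders, not ERCOT scenario data.
DEMAND_MW = 78_000        # assumed 9PM system load
DISPATCHABLE_MW = 70_000  # assumed available gas/coal/nuclear/batteries
SOLAR_MW = 0              # solar is effectively zero after sunset

for wind_mw in (14_000, 10_000, 6_000, 2_000):
    reserves = DISPATCHABLE_MW + SOLAR_MW + wind_mw - DEMAND_MW
    status = "OK" if reserves > 3_000 else "EEA territory"
    print(f"wind={wind_mw:>6,} MW -> reserves={reserves:>7,} MW ({status})")
```

In this sketch, dropping from the expected 14 GW of wind to 10 GW already erases the cushion. That cliff is exactly what the low-wind sensitivity scenarios are probing.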

Meanwhile, dispatchable resources, those whose output can be adjusted as demand requires, shoulder the real balancing act. These include gas, coal, nuclear, and increasingly, batteries. During the 9PM risk hour, 79% of available capacity comes from dispatchable sources. Batteries are growing fast, but most installations are four-hour systems: helpful for peaks, but not enough to ride out extended low-wind spells that can last 12 hours or more.
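The duration limit is simple arithmetic. A four-hour fleet sized for the evening peak runs out of stored energy a third of the way through a 12-hour lull (all figures illustrative):

```python
# A 5 GW / 4-hour battery fleet holds 20 GWh of energy (illustrative).
battery_power_gw = 5.0
battery_duration_h = 4.0
battery_energy_gwh = battery_power_gw * battery_duration_h  # 20 GWh

# Suppose a low-wind spell leaves a steady 5 GW gap for 12 hours.
gap_gw = 5.0
lull_hours = 12.0
energy_needed_gwh = gap_gw * lull_hours  # 60 GWh

hours_covered = battery_energy_gwh / gap_gw
print(f"Fleet covers {hours_covered:.0f} of {lull_hours:.0f} hours; "
      f"shortfall of {energy_needed_gwh - battery_energy_gwh:.0f} GWh")
```

Covering the full lull would take three times the stored energy, which is why today’s batteries complement, rather than replace, dispatchable plants for multi-hour events.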
The upshot: in the hour when Texas risks rolling outages, the state is still overwhelmingly dependent on dispatchable plants to meet load. AI data centers, which demand firm, around-the-clock power, have effectively hard-wired this dependency even deeper.
Why AI demand is uniquely non-negotiable
Some might argue that large industrial users have always stressed power grids. That’s only partly true, because most legacy industries are at least somewhat elastic.
Aluminum smelters curtail operations during sustained high prices. Petrochemical plants scale expansions to fit decade-long cycles. Even crypto miners, despite their chest-thumping about being “flexible load,” routinely unplug when day-ahead prices spike, simply because their economics break.
But AI? That’s a different beast.
A large-scale LLM training run on 5,000 H100 racks can run continuously for weeks. Interrupt it because ERCOT day-ahead pricing jumps from $50 to $800/MWh, and even with periodic checkpointing you lose progress since the last save, idle a cluster whose amortized cost runs into millions of dollars per week, and disrupt a tightly choreographed schedule. Unlike a bitcoin mining rig, which can switch off and resume instantly, a training cluster does not tolerate power interruptions gracefully. Meanwhile, inference workloads are increasingly tied to latency-sensitive consumer products. Your chatbots, copilots, and real-time video synthesis tools all demand instant, continuous compute, which means continuous power.
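A rough back-of-envelope shows why the operator pays through the spike rather than curtailing. The cluster size, price levels, and lost-progress estimate below are all assumptions for illustration:

```python
# Why an AI training cluster rides through a price spike (illustrative).
cluster_mw = 100       # assumed facility draw, MW
spike_price = 800      # $/MWh during the spike
normal_price = 50      # $/MWh baseline
spike_hours = 4        # assumed spike duration

# Cost of simply paying the spike price instead of curtailing:
extra_energy_cost = cluster_mw * (spike_price - normal_price) * spike_hours

# Cost of interrupting: assume even a single lost day of progress on a
# frontier cluster (hardware amortization, schedule slip, re-warmup) is
# worth on the order of $2M.
lost_progress_cost = 2_000_000

print(f"Ride through the spike: ${extra_energy_cost:,}")
print(f"Interrupt and lose a day: ${lost_progress_cost:,}")
```

Under these assumptions the spike costs $300,000 and the interruption costs several times more, so the rational move is to keep drawing power at any realistic price. That is what “non-negotiable demand” means in market terms.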
This is why ERCOT’s probabilistic reserve risk model (PRRM) has now baked large loads directly into its tail scenarios.
The quiet emergence of industrial strategy
Pull back further, and you see how this is rewriting the implicit industrial policy of Texas, and by extension, America’s AI frontier.
Texas has long prided itself on a lightly regulated grid, with ERCOT operating largely outside FERC’s direct oversight. That energy island structure encouraged an explosion of renewables, gas plants, and flexible market experimentation. It also made the state a magnet for hyperscale compute. No surprise that every major AI operator, from OpenAI’s partners to Meta to Amazon, is aggressively building in Texas.
But the MORA inadvertently reveals how power is now flowing in the other direction. ERCOT’s reserve risk model is affected by the cadence of AI data centers coming online. The August report practically says it outright: the drop in blackout probability is because fewer data centers will come online.
This means the AI buildout has effectively become the de facto grid master. The probabilistic Excel models in ERCOT’s reliability offices are now forecasting how much breathing room millions of Texans will have, based on how quickly GPU clusters are scheduled to flip on. That’s a quiet, profound inversion of who controls whom.
But the grid may yet subordinate AI
Here’s the sharper turn almost no one in Silicon Valley seems to acknowledge.
These reliability reports imply that we are rapidly approaching a point where the grid starts to subordinate AI. As more hyperscale loads queue up for interconnects, ERCOT may begin denying approvals, imposing caps, or enforcing contractual interruptibility clauses to keep overall blackout probabilities tolerable.
It’s already starting. Some developers are being forced to accept demand modulation provisions. Others have to co-site their own gas turbines or massive on-site battery banks to get ERCOT sign-off. And once the next marginal data center tips the EEA probability curve too far, the regulator will simply say no.
At that point, it’s not the GPU supply chain throttling AI’s growth. It’s not venture capital constraints or lack of clever model architectures. It’s the reserve adequacy model inside an ERCOT Excel sheet quietly dictating the pace of AI’s next frontier.
Coda
If you enjoy this newsletter, consider sharing it with a colleague.
Most posts are public. Some are paywalled.
I’m always happy to receive comments, questions, and pushback. If you want to connect with me directly, you can:
follow me on Twitter,
connect with me on LinkedIn, or
send an email to dave [at] davefriedman dot co. (Not .com!)
¹ CAFOR (Capacity Available For Operating Reserves): a measure used to assess the likelihood of having sufficient operating reserves on the grid, particularly during periods of high demand, such as peak seasons or extreme weather events.