AIxEnergy

The Five Convergences (Part II of VI): AI as Load: How AI Is Rewiring the Grid

Brandon N. Owens Season 1 Episode 3

Artificial Intelligence is no longer just a software challenge. It’s a physical one. In this episode, we explore a convergence most haven’t seen coming—until now.

Across the U.S., AI training and inference are triggering a historic surge in electricity demand, rivaling the rise of air conditioning in the 20th century. By 2030, AI data centers could consume over 9% of total U.S. electricity—an increase of 400–500 terawatt-hours. That’s like plugging in an extra California.

But AI doesn’t just use power. It reshapes it.

AI campuses run 24/7, don’t follow human behavior, and concentrate demand in tight geographies. The result? New “load islands,” rising grid congestion, regional imbalances, and a multi-billion-dollar race to rewire the energy system.

Brandon N. Owens, author of The Five Convergences and Artificial Intelligence and U.S. Electricity Demand: Trends and Outlook to 2040, breaks down what utilities, regulators, investors, and tech companies must understand about Convergence I: AI as Load.

🔌 Highlights from this Episode:

  • Where AI demand is hitting hardest: From Northern Virginia’s “Data Center Alley” to crypto-fueled megawatt spikes in Texas.
  • Why traditional grid planning is failing: IRPs are outdated, interconnection queues are jammed, and speculative siting is distorting the market.
  • What clean energy advocates need to know: AI could undermine decarbonization—or accelerate it—depending on how we act now.
  • How the electricity system is being gamed: Developers are squatting on transmission rights, driving up costs and delaying critical infrastructure.
  • What leading utilities are doing: Dominion is charging for reserved capacity. ERCOT is scrambling to keep up. The DOE and FERC are playing catch-up.

Subscribe now at AIxEnergy.io.


Host: Welcome to AIxEnergy, the podcast where we explore the rising intersection of artificial intelligence and the systems that power our world. I'm your host, and today we begin our deep-dive series into The Five Convergences, a framework that maps how artificial intelligence, or A-I, is reshaping electric infrastructure from the inside out. This is episode two of six on the topic, and our first deep dive: the concept of A-I as Load.

Our guest today is Brandon N. Owens — founder of AIxEnergy dot I-O and the author of not one, but two reports: The Five Convergences of A-I and Energy and Artificial Intelligence and United States Electricity Demand: Trends and Outlook to Two Thousand Forty. Together, these reports form the intellectual foundation for understanding A-I’s physical footprint on the American electric grid.

Brandon, thanks for joining us.

Brandon: Thanks for having me. It’s an important conversation, and I’m excited to unpack it with you.

Host: Let’s start with the first convergence: A-I as Load. On the surface, it sounds simple. A-I uses electricity. But your work shows that this isn’t just another category of demand—it’s an entirely new paradigm. What’s different about this?

Brandon: A-I as Load represents a fundamental shift in the grid's demand profile. Historically, electricity demand in the United States has been essentially flat for nearly two decades. But now, driven by generative artificial intelligence and machine learning models, we’re seeing explosive demand—on the order of hundreds of terawatt-hours of new consumption by the end of the decade.

In our recent AIxEnergy report, we show that American data centers—most of them supporting A-I training and inference—already consume between seventeen and twenty gigawatts of steady power. By Two Thousand Thirty, they could require upwards of thirty-five to forty gigawatts and consume nearly nine percent of all U.S. electricity. That’s nearly equivalent to adding the entire state of California’s total demand to the grid in just a few years.
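The arithmetic behind those figures is straightforward to check. The sketch below converts a continuous load in gigawatts into annual terawatt-hours and a share of total US consumption; the only number not taken from the conversation is an assumed US annual total of roughly 4,100 terawatt-hours, so treat the output as an order-of-magnitude check rather than a forecast.

```python
# Back-of-envelope check of the figures quoted above (illustrative only).
# Assumption not from the episode: total US annual electricity consumption
# of roughly 4,100 TWh.

HOURS_PER_YEAR = 8_760
US_ANNUAL_TWH = 4_100  # assumed approximate US total, TWh/year

def continuous_gw_to_twh(gw: float) -> float:
    """Convert a steady (24/7) load in GW to annual energy in TWh."""
    return gw * HOURS_PER_YEAR / 1_000  # GWh -> TWh

for gw in (17, 20, 35, 40):
    twh = continuous_gw_to_twh(gw)
    share = 100 * twh / US_ANNUAL_TWH
    print(f"{gw:>3} GW continuous = {twh:5.0f} TWh/yr = {share:4.1f}% of US demand")

# 40 GW running around the clock works out to roughly 350 TWh per year,
# on the order of eight to nine percent of US consumption under this assumption.
```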

Host: And these are not intermittent or flexible loads either.

Brandon: Exactly. A-I campuses are twenty-four-seven, high-capacity, and incredibly dense. Unlike homes or commercial buildings, they don’t follow human activity cycles. They run full-throttle, around the clock. That changes the entire load curve: it fills in the overnight valleys, raises the baseload floor, and makes planning much harder.

Host: Let’s get specific. Where is this happening in the real world?

Brandon: Start with Virginia. Data centers now account for over twenty-five percent of Dominion Energy’s total electricity sales. In Northern Virginia’s “Data Center Alley,” summer peaks are being driven more by servers than air conditioners. Dominion is building new transmission lines and substations just to keep up.

Texas is another hotspot. The Electric Reliability Council of Texas, or E-R-C-O-T, has seen its planning models overwhelmed by sudden megawatt-scale load requests from A-I and cryptocurrency facilities. In one case, a substation event in West Texas triggered a seventeen-hundred megawatt load drop when data centers disconnected all at once.

Other key states include Georgia, Iowa, Oregon, and Ohio. Many of these areas offer relatively low-cost electricity, favorable siting policies, and legacy infrastructure from retired coal plants. That last point is crucial: companies are strategically targeting sites with existing interconnection rights and transmission capacity. It’s essentially a land-and-power grab.

Host: So A-I’s geographic footprint is deepening regional load imbalances.

Brandon: Yes. The clustering of these facilities—what I call “computational load islands”—is creating sharp regional disparities. Some grids are already saturated. Others, particularly in the Midwest and the Pacific Northwest, are now marketing themselves as A-I-friendly because they still have headroom. This pattern will shape where power infrastructure gets built over the next decade.

Host: Let’s talk about planning. You’ve said that traditional grid models can’t keep up. What exactly is broken?

Brandon: Three things. First, the pace. Grid expansion is a five-to-ten-year process. A-I data centers are going from site selection to full operation in under two years. That’s a fundamental mismatch.

Second, forecasting. Most integrated resource plans didn’t anticipate this level of demand. Many utilities have had to revise their forecasts upward by double digits—just in the last year.

Third, interconnection. The current queue system is jammed with speculative requests. Some developers are reserving five hundred megawatts per site—power they may never actually use—just to hold their place in line. That’s clogging the system and crowding out shovel-ready clean energy projects.

Host: Are regulators responding fast enough?

Brandon: Slowly, but some steps are underway. Dominion Energy now requires new data centers to pay for a large portion of their reserved capacity whether they use it or not. That helps ensure grid upgrades are financially viable.

At the federal level, the Department of Energy—D-O-E—convened a task force in Two Thousand Twenty-Four to study A-I infrastructure needs. The Federal Energy Regulatory Commission, known as FERC, and the North American Electric Reliability Corporation, or NERC, are monitoring reliability concerns. But the broader problem remains: grid infrastructure moves at twentieth-century speed, while A-I moves at the speed of silicon.

Host: And this is all unfolding alongside a massive push for decarbonization.

Brandon: Exactly—and that’s the real tension. A-I is both a massive energy consumer and a potential climate solution. If we power this intelligence with fossil fuels, we accelerate emissions. But if we align A-I with clean generation—through twenty-four-seven power purchase agreements, dynamic load shifting, and carbon-aware dispatch—we can make A-I a driver of grid decarbonization.

Google is experimenting with shifting inference workloads to periods of high renewable output. Microsoft has committed to matching its A-I workloads with twenty-four-seven clean energy purchases. But right now, those are the exceptions. We need better incentives, clearer policies, and smarter pricing signals to make that the norm.

Host: So in the best-case scenario, A-I becomes a flexible grid asset—not just a fixed burden.

Brandon: Yes. Training workloads can be scheduled for off-peak hours. Inference can be geographically optimized to align with green power. With the right market mechanisms, A-I data centers could behave like demand-side batteries—soaking up excess power when it’s abundant, and backing off when the grid is stressed. That’s the vision we outline in the “Sustainable A-I” scenario in our report.

But that future doesn’t just happen. It requires coordination between utilities, cloud providers, regulators, and investors. It’s a systems-level challenge.
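To make the demand-side-battery idea concrete, here is a minimal Python sketch of carbon-aware scheduling: deferrable training hours are packed into the cleanest hours of a day. The hourly carbon-intensity profile, job names, and job sizes are hypothetical, and a production system would also need to handle contiguity, deadlines, and locational signals.

```python
# Minimal sketch of carbon-aware scheduling for deferrable AI training jobs.
# The carbon-intensity profile and job definitions below are hypothetical.

from dataclasses import dataclass

@dataclass
class TrainingJob:
    name: str
    hours_needed: int  # deferrable compute hours required

def schedule_low_carbon(jobs: list[TrainingJob],
                        carbon_by_hour: list[float]) -> dict[str, list[int]]:
    """Greedily assign each job's hours to the cleanest remaining hours of the day."""
    # Rank hour indices from cleanest to dirtiest.
    ranked = sorted(range(len(carbon_by_hour)), key=lambda h: carbon_by_hour[h])
    plan: dict[str, list[int]] = {}
    cursor = 0
    for job in jobs:
        plan[job.name] = sorted(ranked[cursor:cursor + job.hours_needed])
        cursor += job.hours_needed
    return plan

# Hypothetical 24-hour carbon-intensity profile (gCO2/kWh) with a midday solar dip.
intensity = [520, 510, 500, 495, 490, 480, 430, 380,
             320, 260, 210, 180, 170, 175, 200, 250,
             330, 410, 470, 500, 510, 515, 520, 525]

jobs = [TrainingJob("model-A", hours_needed=6), TrainingJob("model-B", hours_needed=4)]
print(schedule_low_carbon(jobs, intensity))
```

In this toy run, both jobs land in the midday hours when the assumed renewable output is highest, which is the behavior the scenario describes: compute soaking up clean power when it is abundant and backing off when the grid is stressed.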

Host: Before we wrap—your team at AIxEnergy is developing a new state-level intelligence platform. Tell us what that’s about.

Brandon: We realized that policymakers and planners are flying blind. There is no unified national database that maps A-I infrastructure—data centers, electricity demand, interconnection queues, regulatory readiness—state by state. So we’re building it. We’re compiling verified data on data center siting, transmission congestion, power availability, carbon intensity, permitting timelines, and local policy.

The goal is simple: help utilities, regulators, investors, and developers understand where the next wave of A-I infrastructure can realistically go—and what the grid can handle. We’ll be launching it soon for subscribers at AIxEnergy dot I-O. If you work in energy, infrastructure, or artificial intelligence, this is going to be your daily dashboard.
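As a rough illustration, the sketch below shows one way a state-level record in such a platform might be structured, using the categories Brandon lists. Field names and the placeholder values are hypothetical, not the actual product schema or real data.

```python
# Illustrative only: a possible shape for one state-level record in a platform
# like the one described above. Fields follow the categories mentioned in the
# episode; names and values are hypothetical placeholders, not real data.

from dataclasses import dataclass, field

@dataclass
class StateAIEnergyProfile:
    state: str
    data_centers_operating: int           # siting: facilities in service
    data_centers_announced: int           # siting: announced or under construction
    interconnection_queue_gw: float       # capacity requested but not yet built
    available_headroom_gw: float          # power availability on the existing system
    transmission_congestion_index: float  # relative congestion measure
    avg_carbon_intensity_gco2_per_kwh: float
    median_permitting_months: float
    policy_notes: list[str] = field(default_factory=list)

# Placeholder record for illustration only.
example = StateAIEnergyProfile(
    state="VA",
    data_centers_operating=0,
    data_centers_announced=0,
    interconnection_queue_gw=0.0,
    available_headroom_gw=0.0,
    transmission_congestion_index=0.0,
    avg_carbon_intensity_gco2_per_kwh=0.0,
    median_permitting_months=0.0,
    policy_notes=["Data centers exceed 25% of Dominion Energy sales (per episode)"],
)
print(example.state, example.policy_notes)
```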

Host: That sounds like a game-changer. Final thoughts?

Brandon: Just this: A-I is not a future risk. It’s a present force. And we need to catch up. Fast.

Host: Brandon Owens, founder of AIxEnergy, author of The Five Convergences and Artificial Intelligence and U.S. Electricity Demand—thank you for your insights and your leadership.

Brandon: Thanks. In our next episode, we shift to Convergence Two—A-I as Controller. We’ll explore how synthetic cognition is already operating grid assets in real time. The grid is beginning to think.

Host: Subscribe wherever you get your podcasts. And visit AIxEnergy dot I-O for visuals, reports, and daily intelligence at the frontier of artificial intelligence and electricity.