The AI boom has a power problem. Not a software problem, not a regulation problem, not even a chip shortage problem. A straightforward, physical, there-is-not-enough-electricity-on-the-grid problem. And this week, the price tag for solving it became official: American utility companies plan to spend $1.4 trillion upgrading the national grid over the next five years, an increase of more than 27% from projections made just a year ago. The primary driver is not aging infrastructure or climate resilience. It is data centres, and the AI running inside them.
The figure comes from a new report by PowerLines, a nonprofit consumer education organisation that analysed capital spending plans from 51 investor-owned utilities. Together those companies serve 250 million customers. A majority of them cited data centres as a top driver of capital expenditure in their most recent earnings reports. The message from the utility sector is unambiguous: AI is reshaping the physical energy infrastructure of the United States, and consumers are going to feel it in their electricity bills.
The Scale of What AI Actually Consumes
To understand why $1.4 trillion is not an overreaction, consider the demand figures. US data centres already consumed more than 4% of the country’s total electricity in 2023. That consumption has grown sharply since. By the end of 2026, data centre power demand is expected to approach 1,000 terawatt-hours annually, a figure that would represent a meaningful share of total US electricity generation.
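To make "a meaningful share" concrete, a quick back-of-envelope calculation helps. The total-generation figure below (roughly 4,200 TWh per year) is an approximate recent number supplied here for illustration, not taken from the PowerLines report:

```python
# Back-of-envelope check on how large 1,000 TWh of annual data centre
# demand would be relative to total US electricity generation.
# ASSUMPTION: ~4,200 TWh/year of total US generation is an illustrative
# approximation, not a figure from the report.
projected_dc_demand_twh = 1_000    # projected data centre demand (end of 2026)
assumed_us_generation_twh = 4_200  # assumed total annual US generation

share = projected_dc_demand_twh / assumed_us_generation_twh
print(f"Data centres would account for roughly {share:.0%} of generation")
```

Total generation will itself grow over the same period, so the realised share would land lower, but the order of magnitude is the point: this is not a rounding error on the grid.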
The hyperscalers are not waiting for the grid to catch up. Meta, Microsoft, Amazon, and Alphabet are collectively projected to spend $700 billion on AI infrastructure build-outs in 2026 alone. On an annualised basis, that is roughly two and a half times the entire US utility industry's planned grid spend of about $280 billion a year. A single large data centre campus can consume more power than 100,000 homes. When you build dozens of them simultaneously across the country, the grid feels it.
Who Pays for All of This
This is where the story stops being purely about technology and starts being about politics. Utility spending gets passed to consumers through rate hikes approved by state regulators. US electricity costs have already risen 42% since 2019, significantly outpacing inflation. Utilities requested $31 billion in rate hikes in 2025 alone. A separate PowerLines report found that 56 million Americans are facing higher utility bills as a result of rate increases regulators approved that year.
Goldman Sachs analysis published earlier this year warned that data centre-driven electricity demand will push core inflation up by 0.1 percentage points in both 2026 and 2027. That may sound small, but it lands on top of an inflation picture that has already strained household budgets for several consecutive years. The fundamental question being asked in regulatory hearings and policy circles is pointed: should ordinary households subsidise the electricity needs of companies worth trillions of dollars?
Tech companies are increasingly aware of the optics. Hyperscalers have begun adopting what the industry calls pay-for-your-own-power models, where data centre developers fund their own generation capacity rather than drawing from the shared grid. But not all of them are doing this, and the capacity that does use shared infrastructure is enough to move electricity prices at a national level.
Nuclear’s Unlikely Comeback
The power demand problem has quietly revived an industry that spent decades struggling for commercial relevance. Major tech companies are now putting serious capital behind next-generation nuclear projects as they look for reliable, carbon-free baseload power for data centres that run around the clock. These deals are giving nuclear startups both the funding and the credibility they have long lacked, at a moment when governments and utilities are scrambling to figure out how to support fast-rising demand without blowing past emissions targets.
The appeal is straightforward. Solar and wind are variable. Natural gas carries carbon and geopolitical exposure. Nuclear runs continuously, produces no direct emissions, and can be co-located with data centre campuses. The licensing timelines are still long, but the commercial urgency from Big Tech is accelerating the conversation with regulators in ways that years of climate advocacy could not.
Communities Are Starting to Push Back
Not everyone wants a data centre in their backyard. Business Insider reported this week that 12 state-level data centre moratorium bills were introduced in 2026, reflecting genuine local frustration with the land demands, water stress, noise, and electricity load that large facilities bring to communities. Eleven of those bills have stalled or been voted down, but the political pressure is building.
The tension is not going away. Everyone wants the economic upside of AI, but far fewer people want the physical consequences of the infrastructure that makes it run. For the tech companies building this generation of AI, winning is no longer just a matter of having the best models or the most capital. It requires navigating a grid that was not built for what they are asking of it, communities that are increasingly organised in opposition, and a regulatory environment that is only beginning to understand the scale of what it has been asked to accommodate.
The $1.4 trillion is just the utility side of the ledger. The real number, when you add the hyperscaler spending, the nuclear deals, the land acquisition, and the grid interconnection costs, runs into the multiple trillions. AI has always been described as a software revolution. It turns out it is also one of the largest infrastructure construction projects in American history.
