On March 31, Reuters put a hard number on a change the market has been slow to price correctly. S&P Global estimated that Microsoft, Amazon, Alphabet and Meta planned to spend about $320 billion on data centres, chips and related AI infrastructure in 2026, up from $230 billion the year before and just $40 billion in 2019. That alone would have been enough to keep the old AI narrative going: more capex, more GPUs, more cloud demand, more software leverage.

But the more important line in the Reuters report was not the size of the cheque. It was the constraint attached to it. Data centres require vast amounts of electricity, and AI is becoming dependent not only on chip supply and corporate budgets but on power prices and infrastructure capacity.

Ten days later, the International Energy Agency sharpened the point. Its April 10 report said electricity demand from data centres worldwide is set to more than double by 2030 to roughly 945 terawatt-hours, with US data centres on course to account for almost half of the growth in the country's electricity demand through the end of the decade.

That reframes the AI race. The glamour remains in models, agents and benchmarks. The bottleneck is starting to sit in substations, transmission queues, cooling systems and county-level permits.
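To put the capex figures above in proportion, a quick back-of-envelope sketch. The year pairings ($230 billion "the year before" 2026, $40 billion in 2019) are read straight from the article; the growth rates derived from them are illustrative arithmetic, not S&P Global's own estimates.

```python
# Hyperscaler AI/data-centre capex as cited from S&P Global, in USD billions.
# Year labels are inferred from the article's phrasing.
capex = {2019: 40, 2025: 230, 2026: 320}

yoy = capex[2026] / capex[2025] - 1                # one-year jump into 2026
multiple = capex[2026] / capex[2019]               # total growth since 2019
cagr = (capex[2026] / capex[2019]) ** (1 / 7) - 1  # 7-year compound rate

print(f"YoY growth 2025->2026: {yoy:.0%}")         # ~39%
print(f"Multiple since 2019:   {multiple:.0f}x")   # 8x
print(f"Implied 2019-2026 CAGR: {cagr:.0%}")       # ~35%
```

In other words, a spend that has compounded at roughly a third per year for seven years, and is still accelerating in absolute dollar terms.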
A $320 Billion Buildout Now Depends on Substations, Not Just GPUs

Reuters and the IEA point to the same mechanical shift: AI capacity now scales only when power capacity scales too.
For most of the past two years, investors treated AI infrastructure as a semiconductor story with a software upside. The logic was straightforward. If Nvidia could ship enough accelerators, and if Microsoft, Amazon, Alphabet and Meta could keep funding massive clusters, then model performance would rise, usage would expand, and the rest of the stack would follow. That model still matters, but it is no longer sufficient. A modern AI data centre is not merely a room full of servers. It is a power-intensive industrial site whose economics depend on land, transmission access, backup systems, transformers, cooling equipment and a utility willing to serve a suddenly enormous load.
The IEA's numbers make clear why this changes the investment case. A jump to 945 terawatt-hours by 2030 would put global data-centre electricity demand above the current annual consumption of many mid-sized industrial economies. In the US, where the hyperscalers are concentrating the largest AI clusters, the issue is less whether demand exists than whether the grid can absorb it on schedule. Utilities can plan generation additions. They cannot instantly create transmission lines, interconnection capacity or transformer inventories.
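The scale of that ramp is easier to feel as a growth rate. The 2024 baseline below (~415 TWh) is an assumption consistent with "more than double by 2030 to roughly 945 terawatt-hours", not a figure from the article itself; treat the result as illustrative.

```python
# Back-of-envelope: implied annual growth rate if global data-centre
# electricity demand "more than doubles" to ~945 TWh by 2030.
# Assumption (not in the article): a 2024 base of roughly 415 TWh.
base_twh = 415     # assumed 2024 demand, TWh
target_twh = 945   # IEA projection for 2030, TWh
years = 6

cagr = (target_twh / base_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 15% per year
```

A load class growing ~15% a year collides directly with transmission and transformer lead times that are measured in years, which is the mismatch the paragraph above describes.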