But where will all that power come from?
OpenAI announced this week a massive expansion plan in partnership with Nvidia to build AI datacentres that will consume up to 10 gigawatts of electricity, with another 17 gigawatts of projects already in development.
To put that into perspective, 17GW roughly matches the average electricity demand of Switzerland and Portugal combined.
The numbers are overwhelming, and for many observers they are a wake-up call.
Andrew Chien, a professor of computer science at the University of Chicago, called the announcement a “seminal moment”.
“I’ve been a computer scientist for 40 years, and for most of that time computing was the tiniest piece of our economy’s power use. Now, it’s becoming a large share of what the whole economy consumes,” he told Fortune.
OpenAI CEO Sam Altman, speaking at a recent event in Texas, made no apologies for the scale.
“This is what it takes to deliver AI,” he said, adding that ChatGPT usage has increased tenfold in just 18 months.
While the ambition is clear, so is the challenge: Where will all this energy come from?
The US power grid, built mostly in the mid-20th century, is already under immense strain.
The AI boom, along with the rise of cloud computing and cryptocurrency mining, has sent datacentre construction skyrocketing across the country. Utilities are now swamped with requests, scrambling to add infrastructure and navigate regulatory bottlenecks.
“A year and a half ago they were talking about 5GW. Now they’ve upped the ante to 10, 15, even 17,” Chien said. “There’s an ongoing escalation.”
The Texas grid, where one of Altman’s new facilities is located, typically handles around 80GW.
“So you’re talking about an amount of power that’s comparable to 20% of the whole Texas grid,” Chien said.
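Chien's comparison is easy to sanity-check with the figures cited in this article (a back-of-envelope sketch, assuming the roughly 80GW Texas load and the 17GW project pipeline mentioned above):

```python
# Back-of-envelope check of Chien's comparison, using figures from the article
texas_grid_gw = 80       # typical load handled by the Texas grid
openai_pipeline_gw = 17  # datacentre projects reported to be in development

share = openai_pipeline_gw / texas_grid_gw
print(f"{share:.0%}")  # prints "21%" — i.e. comparable to 20% of the Texas grid
```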
The US Department of Energy has already warned that without rapid investment in new infrastructure, blackouts could rise by more than 100 incidents annually by 2030.
The nuclear bet
So far, Altman has been consistent about one thing: nuclear power is the answer. He has invested in both fission and fusion startups, touting reactors as the only scalable, steady energy source that can match AI’s relentless demands.
But experts are sceptical about the timeline.
“As far as I know, the amount of nuclear power that could be brought on the grid before 2030 is less than a gigawatt,” said Chien.
Fengqi You, a professor of energy systems engineering at Cornell University who also researches AI, agreed: “A typical nuclear plant takes years to permit and build,” he said. “In the short term, they’ll have to rely on renewables, natural gas, and maybe retrofitting older plants.”
Environmental impact
While Altman and Nvidia push toward what they frame as a compute-first economy, environmental experts urge caution. Cooling the massive racks of servers in these datacentres requires huge volumes of fresh water, a resource already strained in many areas.
Building these facilities can also disturb local biodiversity, with potentially serious unintended consequences. There is also the issue of electronic waste: as Nvidia continues its aggressive hardware refresh cycle, older processors are discarded, adding to toxic waste streams.
“They told us these datacentres were going to be clean and green,” Chien added.
“But in the face of AI growth, I don’t think they can be. Now is the time to hold their feet to the fire.”