Hours after his inauguration, Donald Trump declared a “national energy emergency.” The US, he warned in an executive order, suffered from “a precariously inadequate and intermittent energy supply, and an increasingly unreliable grid.” If it failed to expand its energy infrastructure and increase supply, Trump warned, “this situation will dramatically deteriorate in the near future due to a high demand for energy and natural resources to power the next generation of technology.”

Trump was characteristically quick to blame high energy prices on “the policies of the previous administration” and “our Nation’s diminished capacity to insulate itself from hostile foreign actors.” The country was not yet facing anything that could be called an energy emergency (although Trump’s own policies might well lead to one). But the power sector does face steep challenges. After remaining largely flat for two decades, demand for electricity in the US has begun to surge. In 2024 electric utilities predicted that they would need almost twice as much additional power by 2028 as they had estimated just a year earlier. Major power companies in some parts of the country are already struggling to avoid blackouts, especially during extreme weather events. And demand is only going to increase: after all, as many of the world’s nations agreed at the COP28 summit, keeping the Paris Agreement target of 1.5 degrees of warming within reach will require roughly tripling renewable capacity, not just to generate electricity from sources other than dirty fossil fuels but also to power all transit and to heat and cool all buildings with clean energy.

This surging demand has a range of sources, notably the spread of electric vehicles and the electrification of buildings and industry. But a significant contribution to the strain on the grid comes from the energy-hungry data centers that power artificial intelligence. Meta, for instance, is currently developing a facility in Louisiana that Mark Zuckerberg has promised to expand into a data center “supercluster” that would use almost twice as much energy as the entire city of New Orleans. Meanwhile, data centers in Virginia—home to Data Center Alley, one of the densest concentrations of such facilities in the world—consume more than a quarter of the electricity generated in the state. Researchers estimate that the diesel generators used there as backup power could already be causing 14,000 cases of asthma symptoms and imposing public health costs of $220 million to $300 million per year. And in Memphis, Elon Musk’s artificial intelligence company, xAI, is powered by thirty-five methane gas turbines that belch smog-forming pollution.

In recent years AI’s environmental impacts have come under close scrutiny. Public reports about the enormous quantities of energy it uses and the prodigious amounts of water needed to cool the equipment in its data centers began to appear after the launch of the ChatGPT chatbot in 2022, and state-level efforts to mandate reporting of AI’s environmental impact followed quickly thereafter. So far those efforts have produced few results. The Biden administration issued an executive order on the “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” in 2023, but it avoided any comment on the technology’s voracious energy use, focusing instead on promoting competition within the AI industry and preventing AI-enabled threats to civil liberties and national security. Early last year Senator Edward Markey and others introduced a bill that would have required the federal government to assess AI’s environmental footprint and create a system through which companies could volunteer to report future impacts. It died in Congress without a vote, although the European Union did adopt a stronger measure of its own, which went into effect in August 2024.

On the first day of his second term Trump jettisoned Biden’s modest executive order. The next day he announced Project Stargate, an initiative through which several technology giants will invest as much as $500 billion—more than the budget for the Apollo space program—in artificial intelligence and data centers across the US over the next four years. On Tuesday OpenAI announced that it would partner with the firms SoftBank and Oracle—with which it is already building a massive data center in Texas—to construct five new centers in Texas, Ohio, New Mexico, and an as-yet-unannounced state in the Midwest.

Some analysts have predicted that artificial intelligence will help reduce humanity’s environmental footprint by allowing users to make energy-saving choices and by optimizing the technologies on which much of society now depends. Many major tech companies have net-zero goals, and some have made modest gestures toward using AI for the benefit of the planet: in 2017 Microsoft launched a $50 million initiative called “AI for Earth” that promised to support researchers and environmental groups with grants for Microsoft AI tools.

Now that they appear likely to fall short of those net-zero targets, given their current emissions and increasing power needs, several of these firms have begun an effort to resurrect the US nuclear industry as a carbon-neutral source of energy for AI. In October 2024 Google signed a contract with Kairos Power, a “next-generation” nuclear company that has promised to build its first small modular reactor by 2030; Microsoft has signed a twenty-year power purchase agreement with Constellation Energy, which plans to restart a reactor at Three Mile Island, the site of the US’s worst nuclear disaster; Amazon is investing directly in the Maryland-based company X-energy, which is also promising to deliver small modular reactors. The White House has been a strong supporter of nuclear-powered AI: this May, President Trump signed four executive orders aimed at accelerating the construction of nuclear power plants in the US.

The problem with these promises, according to M. V. Ramana, author of Nuclear is Not the Solution, is that the reactor designs used by Kairos and X-energy draw on models from the mid-twentieth century that have suffered failures, unplanned shutdowns, and fundamental technological hurdles when deployed in the past. Nuclear power generated only 9.1 percent of global electricity in 2023, down by nearly half from its peak of 17.5 percent in 1996; the United States has seen only three nuclear reactors come online since that year. Critics such as Ramana point to this history to argue that nuclear power is both slow to build and expensive, particularly compared with solar and wind. Trump’s executive orders indicate that he believes a quick scale-up of the nuclear industry can only be achieved by expediting and deregulating processes at the Nuclear Regulatory Commission and reducing the agency’s independence. That underlines a third concern about the industry: the danger that weakened oversight will lead to a catastrophic system failure.

If these promises of a new nuclear age do turn out to be little more than greenwashing, and if the Trump administration’s efforts to slow the buildout of renewable energy succeed even in part, then AI will likely end up getting a significant share of its power from more fossil fuels. Indeed, the promises of future AI growth are already spurring plans for the construction of more fossil-fuel plants—even in the absence of hard proof that the most extravagant of the forecasts will come true. A new report commissioned by the Southern Environmental Law Center (SELC) suggests that it will be almost impossible to build the number of data centers anticipated by energy demand projections, because doing so would require more than 90 percent of the global chip supply over the next five years. This is hardly surprising: demand tends to be overstated because companies put in requests to build more projects than they intend to complete, knowing many won’t be approved. And yet these forecasts, particularly from the Southeastern US, “are driving a dramatic and unnecessary overbuild of infrastructure,” as SELC Senior Attorney Megan Gibson has argued; that overbuild, in turn, “threatens to lock in fossil fuels, hike energy bills, and crowd out more reliable, cost-effective clean energy.” Daniel Brookshire, an analyst with the organization, reports that monopoly utilities in the South are planning to rapidly build 43,000 megawatts of methane gas plants within the next fifteen years—the equivalent, he notes, of forty nuclear power plants.

It is not easy to determine how much power AI really consumes. Estimates of the number of data centers worldwide run from 9,000 to 11,000, and many more are under construction. The International Energy Agency projected last year that data centers’ annual electricity consumption could double between 2022 and 2026, to 1,000 terawatt-hours, roughly the annual consumption of the entire country of Japan. But that estimate covers not just AI but all data center activities, from storing people’s emails to serving up Netflix videos and “mining” Bitcoin.

One way to get a rough sense of the effect of the AI boom is to look at the history of data center energy consumption in recent years. In the US the amount of electricity going to data centers remained quite flat from 2005 to 2017, despite the growth in those years of cloud-based online services such as Facebook and Netflix, as well as of the cryptocurrency market. (Cryptocurrency consumes a great deal of energy: crypto mining represents between 0.6 and 2.3 percent of all US electricity demand, the Energy Information Administration reported last year. But miners can work when energy prices are low, which goes some way toward diminishing their impact on the grid.) In 2017 energy-intensive hardware designed for AI began to be installed in data centers, and by 2023 data centers’ electricity consumption had doubled. The latest reports show that data centers consume 4.4 percent of all the electricity in the US. Recent projections from the Lawrence Berkeley National Laboratory estimate that by 2028 more than half of the electricity going to data centers will be used for AI, at which point AI might use as much electricity per year as 22 percent of all US households. But such projections are sophisticated guesswork.

Understanding what exactly artificial intelligence is and how it works helps explain why its energy use is so hard to track. Before you ask an AI model to draft a research paper (an increasingly common practice that has led some to conclude that the college essay is dead), write you a recipe, or generate a cool video, it has to be trained in a data center. These hangar-like facilities, owned and operated by Big Tech companies but also by less well-known developers like Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse, are filled with servers containing specialized chips called graphics processing units (GPUs) and the central processing units (CPUs) that feed them data. As the journalists James O’Donnell and Casey Crownhart describe in a major recent report for the MIT Technology Review, a single AI model may be housed on over a dozen GPUs, and a large data center is likely to contain over 10,000 such chips wired together, along with the fans and pumped water that carry off the heat the processors generate.

To say that an AI model must be “trained” essentially means that it must digest a huge amount of data and perform a huge number of practice computations. This is an expensive, energy-intensive process: training OpenAI’s GPT-4, according to O’Donnell and Crownhart, cost over $100 million and required fifty gigawatt-hours of electricity, roughly what San Francisco uses in three days. Only after months of such training can people “inference” the AI models to get answers to their prompts. Each inference draws far less power than training, but the process is repeated an enormous number of times: across 100 million self-driving cars, for example. O’Donnell and Crownhart note that 80 to 90 percent of total AI computing power is now estimated to go to inference rather than training.
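A back-of-the-envelope calculation shows how quickly inference can come to dwarf even an enormous one-time training cost. The sketch below takes the fifty-gigawatt-hour training figure reported for GPT-4; the per-query energy and the daily query volume are purely illustrative assumptions, not reported numbers.

```python
# A back-of-the-envelope sketch, not a reported calculation: how quickly
# cumulative inference energy can overtake a one-time training cost.
# The per-query energy and daily query volume are illustrative assumptions.

TRAINING_ENERGY_WH = 50e9   # ~50 GWh reported for training GPT-4
WH_PER_QUERY = 0.3          # assumed energy per query, in watt-hours
QUERIES_PER_DAY = 500e6     # assumed: 500 million queries per day

daily_inference_wh = WH_PER_QUERY * QUERIES_PER_DAY
days_to_match_training = TRAINING_ENERGY_WH / daily_inference_wh

print(f"Inference energy per day: {daily_inference_wh / 1e6:,.0f} MWh")
print(f"Days for inference to match training: {days_to_match_training:,.0f}")
# Under these assumptions, inference alone uses about 150 MWh a day and
# matches the entire training run in under a year, consistent with the
# estimate that 80 to 90 percent of AI computing power goes to inference.
```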



Ted Shaffrey/AP Photo

The Susquehanna nuclear power plant, next to which Amazon is building a data center, Berwick, Pennsylvania, January 14, 2025

As O’Donnell and Crownhart point out, the electricity used (and carbon emissions generated) by individual AI queries varies considerably based on the nature of the query, the type and size of the AI model that answers it, the time of day it is made, and the energy mix of the electric grid that powers the data center to which it is sent. O’Donnell and Crownhart worked with a team at the University of Michigan to generate examples derived from Meta’s Llama, a popular open-source AI model. They calculated that small models use much less energy than large ones, which tend to give better answers but have to run on more chips: the average text query on a small model consumed only the amount of energy it takes to run a microwave for one-tenth of a second, while a larger model required the equivalent of eight seconds. It also takes more energy for a model to respond to a complex prompt—for example, to write a creative story—than to a simple one. Queries to generate images, which use different kinds of models, rather surprisingly use less energy than queries to large text-generating models. But making a video uses much more: Sasha Luccioni, an AI and climate researcher, calculated that generating a five-second video using CogVideoX, an open-source model, used the same amount of energy as running a microwave for over an hour. If millions of people start making their own videos, AI’s appetite for energy will become truly gargantuan.
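To make those microwave comparisons concrete, here is a minimal unit-conversion sketch; the 1,000-watt appliance rating is an assumption rather than a figure from the University of Michigan researchers.

```python
# A simple unit conversion to make the "microwave time" comparisons
# concrete. The 1,000-watt appliance rating is an assumption, not a
# figure from the researchers.

MICROWAVE_WATTS = 1_000  # assumed power draw of a typical microwave

def microwave_wh(seconds: float) -> float:
    """Energy in watt-hours consumed by running the microwave this long."""
    return MICROWAVE_WATTS * seconds / 3600

print(f"Small-model text query (0.1 s): {microwave_wh(0.1):.3f} Wh")
print(f"Large-model text query (8 s):   {microwave_wh(8):.2f} Wh")
print(f"Five-second AI video (>1 hr):   {microwave_wh(3600):.0f} Wh and up")
# -> roughly 0.03 Wh for a small model, 2.2 Wh for a large one, and
#    1,000 Wh or more for a single short video.
```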

Yet as useful as these calculations are, we cannot be sure that they accurately gauge the energy consumption of many common AI models. That is because most Big Tech companies rely on closed-source AI models—meaning that the code behind them is not available to the public—and do not release sufficient information about their energy and resource use. The Trump administration’s refusal to mandate reporting of AI’s environmental impact means that we have scant idea of what AI is really doing to the planet. As Luccioni puts it, “We should stop trying to reverse-engineer numbers based on hearsay and put more pressure on these companies to actually share the real ones.” Luccioni has created the AI Energy Score, which rates AI models on their energy efficiency, but companies must opt in, and so far too few have done so.

Offering an important glimpse behind this wall of corporate secrecy, Google released a report last month quantifying how much energy its Gemini AI uses with every prompt. Although the quantity per median text query (0.24 watt-hours, enough to run a microwave for about a second) may seem small, multiply it by what could well be a mammoth number of queries made to Gemini every day and a worrying picture of AI’s energy and water consumption begins to emerge. The report also bears out what O’Donnell and Crownhart’s research suggested: complex prompts use much more energy. It is worth noting, too, that the Google report discusses only text prompts, leaving out the much higher energy costs of video generation. Beyond that, Google has not revealed how many Gemini queries it processes every day, which makes it impossible to calculate the model’s total energy footprint.
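The scale of the aggregation problem is easy to illustrate. The sketch below multiplies Google’s reported median figure by several hypothetical daily query volumes; the volumes are assumptions, since Google has not disclosed the real number.

```python
# A hedged illustration of the aggregation problem: Google reports 0.24 Wh
# per median Gemini text prompt but not a daily query count, so any total
# is guesswork. The daily volumes below are hypothetical placeholders.

WH_PER_PROMPT = 0.24  # Google's reported median for a Gemini text prompt

for daily_queries in (100e6, 1e9, 10e9):  # hypothetical query volumes
    mwh_per_day = WH_PER_PROMPT * daily_queries / 1e6
    print(f"{daily_queries:>14,.0f} prompts/day -> {mwh_per_day:8,.1f} MWh/day")
# At ten billion prompts a day, text queries alone would draw about
# 2,400 MWh daily, roughly a 100-megawatt plant running around the clock;
# without the real query count, the total footprint cannot be pinned down.
```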

Another reason that it’s hard to accurately assess energy projections for AI is that Big Tech is rushing to integrate it into virtually every aspect of everyday life. Some of these AI applications are user-facing, like Google’s recently announced “AI-powered personal health coach,” or Spotify’s uncanny ability to predict what song you want to listen to next. Others are hidden from consumers—including, for example, the efforts of credit card companies to spot fraud and the work of technology corporations like Siemens to predict equipment failures before they happen. Researchers are currently building autonomous AI “agents” that will perform tasks for us with far less supervision than the current models require, solving multistep problems based on a single prompt. These agents may spend hours researching and writing complex reports for subscribing users, using enormous quantities of energy in the process. This reality isn’t far off: OpenAI is reportedly planning to offer the services of autonomous AI agents for $20,000 per month. All of this makes the technology’s future environmental impact almost impossible to predict. As O’Donnell and Crownhart put it, “every researcher we spoke to said that we cannot understand the energy demands of this future by simply extrapolating from the energy used in AI queries today.”

AI’s environmental toll appears still greater when one considers the technology’s climate implications beyond the energy consumption of its rapidly proliferating data centers. The political economist Benedetta Brevini, in her 2021 book Is AI Good for the Planet?, notes that firms are selling AI services not just to corporations that want to assess reams of data, such as investment banks and hedge funds, but also to fossil fuel corporations, advertising the new technology’s potential to optimize the pace and productivity of extraction.1 Both Amazon and Microsoft offer AI tools to fossil fuel companies to help them identify new oil and gas reserves. This supports the fossil fuel industry without even the excuse long used to protect coal: that it creates jobs and benefits workers. The US’s crude output has been breaking records for the past few years, but jobs in the oil and gas sector have become increasingly scarce—having fallen to 380,000 last year from a high of 600,000 in 2014—as AI takes over identifying optimal drilling locations, monitoring pipelines, and predicting the rise and fall of demand.

Further environmental damage, as Brevini and the South African sociologist Michael Kwet both note, comes from the production of the devices on which AI runs. Despite the conveniently intangible associations of “cloud” computing, the GPUs that host AI models consist of a thin layer of semiconductor, usually silicon, onto which components made of various metals are layered. Typical metals include copper, aluminum, cobalt, and tungsten, all of which come with an environmental cost. Hundreds of tons of ore must be dug up to yield a single ton of usable metal, a process that is doubly polluting, since extraction not only poisons the air and water but also requires large quantities of energy. Many of these metals are, moreover, mined in areas rife with human rights abuses and armed conflict. According to an analysis of its 2023 Conflict Minerals Report, for example, Amazon could not rule out the possibility that some of its suppliers had sourced minerals from nine of ten African countries where human rights–violating militias finance themselves through mining.

The benefits of the technology are as unevenly distributed as its costs. Brevini notes that while consultancies emphasize AI’s “global benefits,” those benefits are concentrated in the US and Asia. According to a 2024 report by the World Intellectual Property Organization, between 2014 and 2023 China filed 70 percent of patents related to generative AI, followed by the US at 11 percent, South Korea at 8 percent, and Japan at 6 percent.

While European nations are among those lagging behind on research and patents, the implications are especially serious for countries in the Global South, many of which are locked out of AI’s gains by their lack of infrastructure, Brevini argues, while their economies stand to be the worst affected by its uptake. Even as “rich tech giants in the Global North monopolize the means of computation and knowledge,” as Kwet puts it in his recent book Digital Degrowth, “the poor countries perform the menial labor, like digging in the dirt for metal, picking coffee beans, labeling data to train artificial intelligence models, or cleansing social media networks of disturbing content.”2

There are generally two kinds of proposals from the left for mitigating the abuses of Big Tech: antitrust measures that would supposedly reestablish “fair” and “competitive” capitalism by curbing the industry’s concentrated power and wealth; and human rights measures that would establish “ethical AI” by rectifying algorithmic bias and surveillance. Neither, as Kwet argues, would do much to curb AI’s energy costs. Breaking up Big Tech corporations with antitrust legislation, for instance, would do little more than replace a few megacompanies with myriad smaller competitors, all of them still vying to grow as much as possible.

In place of such proposals, Kwet advocates a global effort to dismantle Big Tech’s current near-monopoly on the means of computation. Internet users in both the Global North and the Global South should, he argues, organize to boycott software and platforms created by corporations like Amazon, Meta, and Microsoft and instead use free programs, whose licenses ensure that anyone can access and modify the software itself. Pried free of the ceaseless growth imperative that drives Big Tech, such “People’s Tech” could be used to help society decide more democratically and rationally how to use environmental resources, and even how to wind down unnecessary production and consumption. Kwet argues that People’s Tech should be an integral element of a broader “Digital Tech Deal” aligned with a sweeping slate of environmental reforms, such as the Red Deal proposed by the Indigenous group The Red Nation. But this is, he admits, a longer-term and far more ambitious project.



Chip Somodevilla/Getty Images

Donald Trump addressing a summit on “winning the AI race” cohosted by the Hill & Valley Forum and the venture capital podcast All-In, Washington, D.C., July 23, 2025

It is just as hard to imagine challenging the material foundations of Big Tech’s hegemony. As Kwet acknowledges, producing hardware like semiconductors, some of whose features are now only atoms thick, requires enormous concentrations of technical knowledge and capital. But that should not dissuade us, he argues, from working to socialize supply chains, as well as digital intelligence and data, as part of what he calls “People’s Tech for People’s Power.” He points to examples like the Open Data movement, which advocates for privacy, security, transparency, and democratic decision-making in the collection, storage, and use of data.

It could turn out that the current AI frenzy is merely a bubble, as Emily Bender and Alex Hanna argue in their recent book The AI Con. But given the Trump administration’s unbridled backing of the tech industry, it will take significant political organizing to pop it. There are some important reforms that we can fight for while we also strive to build the People’s Tech that Kwet calls for. We must first challenge the rhetoric about AI’s limitless need for more energy. AI can be made far more efficient if data centers are required to be even somewhat flexible about when they consume power, dialing back their use during the relatively rare periods when regional power grids experience peak stress (think heat waves or polar vortexes). A recent report from researchers at Duke University suggests that the US grid would already have enough capacity to power many new data centers with minimal expansion if tech companies were required to adopt these protocols. AI models could similarly be put on flexible training schedules so that they use energy at optimum moments.
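What such a flexibility protocol could look like is easy to sketch. The fragment below is a minimal illustration under assumed numbers; neither the stress signal nor the threshold comes from the Duke report. The point is only that training, unlike serving live queries, can be deferred.

```python
# A minimal sketch of the flexibility idea, assuming a hypothetical
# grid-stress signal; this is not the Duke researchers' model.

GRID_STRESS_THRESHOLD = 0.95  # assumed: curtail above 95% of peak capacity

def should_curtail(current_load_mw: float, peak_capacity_mw: float) -> bool:
    """Return True when flexible workloads should be dialed back."""
    return current_load_mw / peak_capacity_mw >= GRID_STRESS_THRESHOLD

def schedule_training(grid_load_mw: float, capacity_mw: float) -> str:
    if should_curtail(grid_load_mw, capacity_mw):
        # Checkpoint the training run so no work is lost, then step aside.
        return "pause training and checkpoint; serve only live queries"
    return "run training at full power"

# During a heat wave the regional grid nears its limit:
print(schedule_training(grid_load_mw=97_000, capacity_mw=100_000))
# -> pause training and checkpoint; serve only live queries
```

Because training jobs already checkpoint their progress routinely, pausing them during the few hours a year when the grid is most stressed costs the companies little while freeing substantial capacity.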

Evidence suggests that none of these reforms will happen without government regulation. Activists in the US must revive and fight for the passage of the AI Environmental Impacts Act, which failed to pass in 2024 and would have made the technology’s energy effects more transparent. Major tech companies should also be made to pay for the expanded infrastructure necessary to produce the additional electricity they say they need. As M. V. Ramana argues, when tech firms pledge to support a nuclear renaissance in the US, their promised contributions are paltry compared with the actual cost of building the fleet of power plants they propose.

Making Big Tech pay for the power it consumes might encourage tech executives to align themselves against the current Republican effort to defund the renewable energy buildout that began with the Biden administration’s Inflation Reduction Act. Adding utility-scale solar and wind power to the grid is, after all, far cheaper and faster not only than the nuclear boondoggle but also than building new fossil gas plants. States should be requiring a rapid and carefully planned buildout of such renewable capacity. The Build Public Renewables Act, passed in New York in 2023 after a campaign by Public Power NY, can serve as an example.

Such reforms are all the more necessary given that, as a recent report by Eliza Martin and Ari Peskoe of Harvard Law School documents, the nation’s for-profit utilities stand to boost their earnings by attracting influential tech corporations. Utilities such as Duke Energy proudly publicize their deals to generate clean electricity (including from small modular nuclear reactors) for firms such as Amazon, Microsoft, and Google, and they regularly sign contracts to supply data centers with electricity. The energy companies assure their states’ public utility commissions “that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices,” Martin and Peskoe write. “But verifying this claim is all but impossible,” they observe, not least because “regulators frequently approve special contracts in short and conclusory orders,” and utility companies have a clear financial incentive to offer tech firms discounted rates and raise prices elsewhere to make up the difference. In other contexts, utilities have reportedly done just that: recent litigation against Duke, one of the country’s largest utilities, revealed that the company had offered the city of Fayetteville, North Carolina, a $325 million discount; the district court noted that an internal Duke document “disclosed a plan to shift the cost of the discount…back to its wholesale [and] retail customers in years to come.”

The nation’s century-old regulatory regime is based on the idea that the public benefits from new infrastructure like power plants and transmission lines. State regulators therefore almost always approve utilities’ requests for new infrastructure, the cost of which can be passed on to customers, whether that infrastructure is being built to accommodate the needs of a growing urban population or a single new corporate client. Martin and Peskoe argue that this approach risks lumping ordinary consumers together with institutional consumers of power, including data centers: “The very same rate structures that have socialized the costs of reliable power delivery are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” We are, in essence, all being asked to shoulder the energy and environmental burden of the AI boom.

But, like other environmental burdens, the toll of the AI boom doesn’t fall equally. It cleaves to already deep lines of race and class injustice. Elon Musk’s xAI data center in Memphis, for example, was built near a predominantly Black community that environmental justice advocates say has been plagued for decades by pollution from nearby industrial plants. The thirty-five fossil gas turbines that power the supercomputer running the company’s AI chatbot, Grok, emit 1,200 to 2,000 tons of nitrogen oxides a year. According to research by the SELC, this is far more pollution than the gas-fired power plant across the street from the xAI facility emits, or than the oil refinery down the road generates. The county leads the state in emergency department visits for asthma. At first, Politico reported in May, none of xAI’s turbines were equipped with the pollution controls usually required by the federal government; the company obtained a permit from the Shelby County Health Department only in July, after sustained community criticism. The SELC had to partner with a conservation organization of volunteer pilots to conduct a flyover just to find out how many gas-fired turbines xAI was using.

Communities in the South are starting to fight back. Last month the SELC alleged that residents still lacked “critical information about financial and environmental impacts” from a “hyperscale data center complex” that public officials and developers hoped to build in Bessemer, Alabama. According to the SELC, the original plan for the center, known as “Project Marvel,” called for the construction of eighteen server farms, each the size of a Walmart Supercenter, on a seven-hundred-acre campus that would consume at least two million gallons of water a day and require 1,200 megawatts of power to operate—9 percent of Alabama Power’s total capacity, by some estimates. At a crowded public meeting this August, the Bessemer City Council voted to send the data center project back to the planning and zoning board. One month later the board voted in favor of the proposed development, which now heads back to the city council. Whether or not the development goes through, the episode suggests that ever more people are coming around to the harm that AI and its attendant infrastructure do to our environments and communities. “God’s most important rule is to love your neighbor as yourself,” a local resident named Mary Rosenbloom told the SELC. “But data centers are anything but good neighbors.”