{"id":191487,"date":"2025-09-30T04:35:07","date_gmt":"2025-09-30T04:35:07","guid":{"rendered":"https:\/\/www.newsbeep.com\/us\/191487\/"},"modified":"2025-09-30T04:35:07","modified_gmt":"2025-09-30T04:35:07","slug":"responding-to-the-climate-impact-of-generative-ai-mit-news","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/us\/191487\/","title":{"rendered":"Responding to the climate impact of generative AI | MIT News"},"content":{"rendered":"<p>In part 2 of our two-part series on\u00a0<a href=\"https:\/\/news.mit.edu\/2025\/explained-generative-ai-environmental-impact-0117\" rel=\"nofollow noopener\" target=\"_blank\">generative artificial intelligence\u2019s environmental impacts<\/a>, MIT News explores some of the ways experts are working to reduce the technology\u2019s carbon footprint.<\/p>\n<p>The energy demands of\u00a0<a href=\"https:\/\/news.mit.edu\/2023\/explained-generative-ai-1109\" rel=\"nofollow noopener\" target=\"_blank\">generative AI<\/a> are expected to continue increasing dramatically over the next decade.<\/p>\n<p>For instance, an April 2025 report from the International Energy Agency predicts that the\u00a0<a href=\"https:\/\/www.iea.org\/news\/ai-is-set-to-drive-surging-electricity-demand-from-data-centres-while-offering-the-potential-to-transform-how-the-energy-sector-works\" rel=\"nofollow noopener\" target=\"_blank\">global electricity demand from data centers<\/a>, which house the computing infrastructure to train and deploy AI models, will more than double by 2030, to around 945 terawatt-hours. 
While not all operations performed in a data center are AI-related, this total amount is slightly more than the energy consumption of Japan.<\/p>\n<p>Moreover, an August 2025 analysis from Goldman Sachs Research forecasts that about 60 percent of the increasing electricity demands from data centers will be met by burning fossil fuels, increasing\u00a0<a href=\"https:\/\/www.goldmansachs.com\/insights\/articles\/how-ai-is-transforming-data-centers-and-ramping-up-power-demand\" rel=\"nofollow noopener\" target=\"_blank\">global carbon emissions by about 220 million tons<\/a>. In comparison, driving a gas-powered car for 5,000 miles produces about 1 ton of carbon dioxide.<\/p>\n<p>These statistics are staggering, but at the same time, scientists and engineers at MIT and around the world are studying innovations and interventions to mitigate AI\u2019s ballooning carbon footprint, from boosting the efficiency of algorithms to rethinking the design of data centers.<\/p>\n<p>Considering carbon emissions<\/p>\n<p>Talk of reducing generative AI\u2019s carbon footprint is typically centered on \u201coperational carbon\u201d \u2014 the emissions produced by running the powerful processors, known as GPUs, inside a data center. It often ignores \u201cembodied carbon,\u201d the emissions created by building the data center in the first place, says Vijay Gadepally, senior scientist at MIT Lincoln Laboratory, who leads research projects in the Lincoln Laboratory Supercomputing Center.<\/p>\n<p>Constructing and retrofitting a data center, built from tons of steel and concrete and filled with air conditioning units, computing hardware, and miles of cable, produces a huge amount of carbon emissions. 
In fact, the environmental impact of building data centers is one reason companies like\u00a0<a href=\"https:\/\/www.constructiondive.com\/news\/meta-piloting-mass-timber-for-sustainable-data-center-construction-clt-green-building\/756919\/\" rel=\"nofollow noopener\" target=\"_blank\">Meta<\/a> and\u00a0<a href=\"https:\/\/www.esgdive.com\/news\/microsoft-swaps-concrete-steel-with-wood-in-data-centers-to-cut-emissions\/731912\/\" rel=\"nofollow noopener\" target=\"_blank\">Microsoft<\/a> are exploring more sustainable building materials. (Cost is another factor.)<\/p>\n<p>Plus, data centers are enormous buildings \u2014 the world\u2019s largest, the China Telecom-Inner Mongolia Information Park, covers\u00a0<a href=\"https:\/\/datacentremagazine.com\/top10\/top-10-biggest-data-centres\" rel=\"nofollow noopener\" target=\"_blank\">roughly 10 million square feet<\/a> \u2014 with about 10 to 50 times the energy density of a normal office building, Gadepally adds.\u00a0<\/p>\n<p>\u201cThe operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon, too, but we need to do more on that front in the future,\u201d he says.<\/p>\n<p>Reducing operational carbon emissions<\/p>\n<p>When it comes to reducing operational carbon emissions of AI data centers, there are many parallels with home energy-saving measures. 
For one, we can simply turn down the lights.<\/p>\n<p>\u201cEven if you have the worst lightbulbs in your house from an efficiency standpoint, turning them off or dimming them will always use less energy than leaving them running at full blast,\u201d Gadepally says.<\/p>\n<p>In the same fashion, research from the Supercomputing Center has shown that \u201cturning down\u201d the GPUs in a data center so they consume about <a href=\"https:\/\/news.mit.edu\/2025\/qa-vijay-gadepally-climate-impact-generative-ai-0113\" rel=\"nofollow noopener\" target=\"_blank\">three-tenths the energy<\/a> has minimal impacts on the performance of AI models, while also making the hardware easier to cool.<\/p>\n<p>Another strategy is to use less energy-intensive computing hardware.<\/p>\n<p>Demanding generative AI workloads, such as training new reasoning models like GPT-5, usually need many GPUs working simultaneously. The Goldman Sachs analysis estimates that a state-of-the-art system could soon have as many as 576 connected GPUs operating at once.<\/p>\n<p>But engineers can sometimes achieve similar results by reducing the precision of computing hardware, perhaps by switching to less powerful processors that have been tuned to handle a specific AI workload.<\/p>\n<p>There are also measures that boost the efficiency of training power-hungry deep-learning models before they are deployed.<\/p>\n<p>Gadepally\u2019s group found that about half the electricity used for training an AI model is spent to get the last 2 or 3 percentage points in accuracy. 
Stopping the training process early can save a lot of that energy.<\/p>\n<p>\u201cThere might be cases where 70 percent accuracy is good enough for one particular application, like a recommender system for e-commerce,\u201d he says.<\/p>\n<p>Researchers can also take advantage of efficiency-boosting measures.<\/p>\n<p>For instance, a postdoc in the Supercomputing Center realized the group might run a thousand simulations during the training process to pick the two or three best AI models for their project.<\/p>\n<p>By building a tool that allowed them to avoid about 80 percent of those wasted computing cycles, they dramatically reduced the energy demands of training with no reduction in model accuracy, Gadepally says.<\/p>\n<p>Leveraging efficiency improvements<\/p>\n<p>Constant innovation in computing hardware, such as denser arrays of transistors on semiconductor chips, is still enabling dramatic improvements in the energy efficiency of AI models.<\/p>\n<p>Even though energy efficiency improvements have been slowing for most chips since about 2005, the amount of computation that GPUs can do per joule of energy has been improving by 50 to 60 percent each year, says Neil Thompson, director of the FutureTech Research Project at MIT\u2019s Computer Science and Artificial Intelligence Laboratory and a principal investigator at MIT\u2019s Initiative on the Digital Economy.<\/p>\n<p>\u201cThe still-ongoing \u2018Moore\u2019s Law\u2019 trend of getting more and more transistors on chip still matters for a lot of these AI systems, since running operations in parallel is still very valuable for improving efficiency,\u201d says Thompson.<\/p>\n<p>Even more significant, his group\u2019s research indicates that efficiency gains from new model architectures that can solve complex problems faster, consuming less energy to achieve the same or better results, are doubling every eight or nine months.<\/p>\n<p>Thompson coined the term \u201c<a 
href=\"https:\/\/ide.mit.edu\/insights\/the-importance-of-aigorithm-efficiency-can-nflops-hold-the-key-to-efficiency-gains\/\" rel=\"nofollow noopener\" target=\"_blank\">negaflop<\/a>\u201d to describe this effect. The same way a \u201cnegawatt\u201d represents electricity saved due to energy-saving measures, a \u201cnegaflop\u201d is a computing operation that doesn\u2019t need to be performed due to algorithmic improvements.<\/p>\n<p>These could be things like \u201c<a href=\"https:\/\/news.mit.edu\/2023\/new-techniques-efficiently-accelerate-sparse-tensors-1030\" rel=\"nofollow noopener\" target=\"_blank\">pruning<\/a>\u201d away unnecessary components of a neural network or employing\u00a0<a href=\"https:\/\/news.mit.edu\/2022\/machine-learning-edge-microcontroller-1004\" rel=\"nofollow noopener\" target=\"_blank\">compression techniques<\/a> that enable users to do more with less computation.<\/p>\n<p>\u201cIf you need to use a really powerful model today to complete your task, in just a few years, you might be able to use a significantly smaller model to do the same thing, which would carry much less environmental burden. Making these models more efficient is the single most important thing you can do to reduce the environmental costs of AI,\u201d Thompson says.<\/p>\n<p>Maximizing energy savings<\/p>\n<p>While reducing the overall energy use of AI algorithms and computing hardware will cut greenhouse gas emissions, not all energy is the same, Gadepally adds.<\/p>\n<p>\u201cThe amount of carbon emissions in 1 kilowatt hour varies quite significantly, even just during the day, as well as over the month and year,\u201d he says.<\/p>\n<p>Engineers can take advantage of these variations by leveraging the flexibility of AI workloads and data center operations to maximize emissions reductions. 
For instance, some generative AI workloads don\u2019t need to be performed in their entirety at the same time.<\/p>\n<p>Splitting computing operations so some are performed later, when more of the electricity fed into the grid is from renewable sources like solar and wind, can go a long way toward reducing a data center\u2019s carbon footprint, says Deepjyoti Deka, a research scientist in the MIT Energy Initiative.<\/p>\n<p>Deka and his team are also studying \u201csmarter\u201d data centers where the AI workloads of multiple companies using the same computing equipment are flexibly adjusted to improve energy efficiency.<\/p>\n<p>\u201cBy looking at the system as a whole, our hope is to minimize energy use as well as dependence on fossil fuels, while still maintaining reliability standards for AI companies and users,\u201d Deka says.<\/p>\n<p>He and others at MITEI are building a flexibility model of a data center that considers the differing energy demands of training a deep-learning model versus deploying that model. 
Their hope is to uncover the best strategies for scheduling and streamlining computing operations to improve energy efficiency.<\/p>\n<p>The researchers are also exploring the use of long-duration energy storage units at data centers, which store excess energy for times when it is needed.<\/p>\n<p>With these systems in place, a data center could use stored energy that was generated by renewable sources during a high-demand period, or avoid the use of diesel backup generators if there are fluctuations in the grid.<\/p>\n<p>\u201cLong-duration energy storage could be a game-changer here because we can design operations that really change the emission mix of the system to rely more on renewable energy,\u201d Deka says.<\/p>\n<p>In addition, researchers at MIT and Princeton University are developing a software tool for investment planning in the power sector, called\u00a0<a href=\"https:\/\/energy.mit.edu\/genx\/\" rel=\"nofollow noopener\" target=\"_blank\">GenX<\/a>, which could be used to help companies determine the ideal place to locate a data center to minimize environmental impacts and costs.<\/p>\n<p>Location can have a big impact on reducing a data center\u2019s carbon footprint. 
For instance, Meta operates a\u00a0<a href=\"https:\/\/datacenters.atmeta.com\/wp-content\/uploads\/2025\/02\/Meta_s-Lulea-Data-Center.pdf\" rel=\"nofollow noopener\" target=\"_blank\">data center in Lulea<\/a>, a city on the coast of northern Sweden where cooler temperatures reduce the amount of electricity needed to cool computing hardware.<\/p>\n<p>Thinking farther outside the box (way farther), some governments are even exploring the construction of\u00a0<a href=\"https:\/\/spectrum.ieee.org\/data-center-on-the-moon\" rel=\"nofollow noopener\" target=\"_blank\">data centers on the moon<\/a> where they could potentially be operated with nearly all renewable energy.<\/p>\n<p>AI-based solutions<\/p>\n<p>Currently, the expansion of renewable energy generation here on Earth isn\u2019t keeping pace with the rapid growth of AI, which is one major roadblock to reducing its carbon footprint, says Jennifer Turliuk MBA \u201925, a short-term lecturer, former Sloan Fellow, and former practice leader of climate and energy AI at the Martin Trust Center for MIT Entrepreneurship.<\/p>\n<p>The local, state, and federal review processes required for new renewable energy projects can take years.<\/p>\n<p>Researchers at MIT and elsewhere are exploring the use of AI to speed up the process of connecting new renewable energy systems to the power grid.<\/p>\n<p>For instance, a generative AI model could streamline interconnection studies that determine how a new project will impact the power grid, a step that often takes years to complete.<\/p>\n<p>And when it comes to <a href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3485128\" rel=\"nofollow noopener\" target=\"_blank\">accelerating the development and implementation of clean energy technologies<\/a>, AI could play a major role.<\/p>\n<p>\u201cMachine learning is great for tackling complex situations, and the electrical grid is said to be one of the largest and most complex machines in the world,\u201d Turliuk adds.<\/p>\n<p>For 
instance, AI could help optimize the prediction of solar and wind energy generation or identify ideal locations for new facilities.<\/p>\n<p>It could also be used to perform predictive maintenance and fault detection for solar panels or other green energy infrastructure, or to monitor the capacity of transmission wires to maximize efficiency.<\/p>\n<p>By helping researchers gather and analyze huge amounts of data, AI could also inform targeted policy interventions aimed at getting the biggest \u201cbang for the buck\u201d from areas such as renewable energy, Turliuk says.<\/p>\n<p>To help policymakers, scientists, and enterprises consider the multifaceted costs and benefits of AI systems, she and her collaborators developed the Net Climate Impact Score.<\/p>\n<p>The score is a framework that can be used to help determine the net climate impact of AI projects, considering emissions and other environmental costs along with potential environmental benefits in the future.<\/p>\n<p>At the end of the day, the most effective solutions will likely result from collaborations among companies, regulators, and researchers, with academia leading the way, Turliuk adds.<\/p>\n<p>\u201cEvery day counts. We are on a path where the effects of climate change won\u2019t be fully known until it is too late to do anything about it. 
This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intense,\u201d she says.<\/p>\n","protected":false},"excerpt":{"rendered":"In part 2 of our two-part series on\u00a0generative artificial intelligence\u2019s environmental impacts, MIT News explores some of the&hellip;\n","protected":false},"author":2,"featured_media":191488,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[46],"tags":[111557,111558,191,111555,1873,111559,111560,111556,111554,74,111553],"class_list":{"0":"post-191487","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-computing","8":"tag-ai-carbon-footprint","9":"tag-ai-emissions","10":"tag-computing","11":"tag-deepjyoti-deka","12":"tag-generative-ai","13":"tag-generative-ai-energy-use","14":"tag-generative-artificial-intelligence-emissions","15":"tag-jennifer-turliuk","16":"tag-neil-thompson","17":"tag-technology","18":"tag-vijay-gadepally"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/191487","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/comments?post=191487"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/191487\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media\/191488"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media?parent=191487"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/categories?post=191487"},{"tax
onomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/tags?post=191487"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}