Addressing AI’s energy-guzzling nature all comes down to using artificial intelligence thoughtfully, according to a recent Accenture report.
Artificial intelligence, which gobbles up data center power, is running away with available resources. “The infrastructure fueling the AI revolution is consuming unprecedented amounts of electricity and water at rates that rival entire nations,” according to a new analysis out of Accenture. Such energy-guzzling by AI is likely to grow if not addressed.
Over the next five years, AI data centers could consume up to 612 terawatt-hours, which is equivalent to Canada’s total annual electricity consumption, the report’s authors, led by Stephanie Jamison, state. AI’s share of global power consumption is set to rise from 0.2% in 2024 to 1.9% by 2030, an extraordinary 48% CAGR, far outpacing the expected 1.5% growth in overall electricity demand over the same period.
There’s another critical resource at risk as well: “these data centers are predicted to guzzle more than 3 billion cubic meters of water annually – more than the total annual freshwater withdrawals of countries like Norway or Sweden.”
The Accenture team recommends AI itself as an effective approach to reining in AI consumption. “With AI’s energy footprint growing relentlessly, AI governance-as-code — embedding sustainability into automated compliance systems — is key to reducing risks, lowering costs and building resilient, future-proof AI ecosystems,” they advise.
AI-driven automation “can help enforce sustainability policies and manage environmental risks in real time,” they continue. “Automation can also make it easier to select the most sustainable infrastructure for each model deployment.”
Tools and platforms such as the Cloud Native Computing Foundation’s Kepler project (Kubernetes-based Efficient Power Level Exporter) can help “support energy-aware AI scheduling.” In addition, intelligent workload orchestration tools such as Karmada (Kubernetes-based Multi-Cloud, Multi-Cluster Orchestrator) can help “optimize AI workloads across regions based on carbon intensity.”
Accenture also developed what it calls a Sustainability-Adjusted Intelligence Quotient (SAIQ), a measure of how efficiently AI systems convert money, electricity, water and carbon into actual performance. SAIQ is a composite efficiency score that measures the environmental and economic performance of AI systems, weighing factors such as costs, electricity consumption, carbon emissions, and water intake.
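Accenture has not published the SAIQ formula, but the idea of a composite efficiency score can be sketched in a few lines. The weights, inputs, and normalization below are illustrative assumptions only, not Accenture’s methodology:

```python
# Hypothetical sketch of a SAIQ-style composite score: performance delivered
# per weighted unit of resources consumed. Weights and units are illustrative
# assumptions, not Accenture's published methodology.
def saiq(performance, cost_usd, energy_kwh, water_m3, carbon_kg,
         weights=(0.25, 0.25, 0.25, 0.25)):
    """Return a performance-per-resource score; higher means more efficient."""
    w_cost, w_energy, w_water, w_carbon = weights
    resource_burden = (w_cost * cost_usd + w_energy * energy_kwh +
                       w_water * water_m3 + w_carbon * carbon_kg)
    return performance / resource_burden

# Example: two deployments of the same model at similar accuracy.
# Deployment B consumes far fewer resources, so it scores higher.
a = saiq(performance=92.0, cost_usd=100, energy_kwh=50, water_m3=2, carbon_kg=20)
b = saiq(performance=90.0, cost_usd=40, energy_kwh=20, water_m3=1, carbon_kg=8)
```

A score like this makes the trade-off explicit: a slightly less accurate model that uses half the electricity and water can still be the better business choice.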
See also: AI Model Mimics Brain Neurons to Reduce Energy Costs
Curbing AI’s Energy-Guzzling
The Accenture authors offer numerous suggestions for bringing AI energy consumption under control, including the following:
Deploy AI at the edge: This “cuts down on cloud use and improves performance by reducing latency. Edge AI applications are especially well-suited to industries that rely on real-time data processing, such as manufacturing, healthcare, retail and financial services.”
Adopt dynamic scaling and smart load balancing: “Match energy use to AI workloads.” Design AI infrastructure “with energy proportionality in mind, drawing on principles like dynamic efficiency scaling to optimize power use across AI workloads. Reduce peak energy demands by using adaptive scheduling to shift AI processing to times when power is cheapest and cleanest.”
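The adaptive-scheduling idea can be illustrated with a small sketch: defer a deferrable batch job to the hour with the lowest forecast grid carbon intensity. The forecast values below are made up for illustration; a production system would pull real data from a grid-signal provider:

```python
# Illustrative sketch of adaptive scheduling: pick the hour in a carbon
# intensity forecast (gCO2/kWh) when deferred AI processing would be
# cleanest. Forecast numbers are invented for this example.
def pick_greenest_hour(carbon_forecast):
    """Return the index of the hour with the lowest forecast carbon intensity."""
    return min(range(len(carbon_forecast)), key=lambda h: carbon_forecast[h])

forecast = [450, 430, 390, 310, 240, 220, 260, 340]  # next 8 hours, gCO2/kWh
best_hour = pick_greenest_hour(forecast)
```

The same selection logic applies to price: swap the carbon forecast for an hourly tariff and the job lands in the cheapest window instead.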
Choose right-size AI models: “Instead of defaulting to large-scale LLMs, deploy task-specific AI models. Techniques like Retrieval-Augmented Generation (RAG) can reduce inference costs by accessing data only when needed.”
It all comes down to using AI thoughtfully, the Accenture team states. “The paradox of AI is that it can be used both more selectively and more broadly to reduce its impacts. Businesses that get it right can drive sustainability, profitability and competitiveness to new heights.”