AI’s Energy and Resource Footprint: Transforming Global Power Systems


The relentless expansion of generative artificial intelligence (AI) has been hailed as a technological revolution, yet it relies on a vast, often invisible physical infrastructure that demands enormous quantities of electricity, water, and critical minerals.

This article aims to uncover this hidden environmental footprint by detailing the scale of consumption and the resulting strains on global resources.

It draws on financial projections from Goldman Sachs, global energy benchmarks from the IEA, academic research, and industry reports to highlight the material costs behind the AI revolution and the challenges of building a truly sustainable digital future.

How Much Energy and Resources Does AI Use?

AI infrastructure demands immense electricity and physical resources, driving a new commodity supercycle. Global data center power usage, currently 55 GW, is projected to surge 165% by 2030. This expansion places unprecedented strain on grids and freshwater supplies, with AI estimated to withdraw up to 6.6 billion cubic meters of water globally by 2027.

Key material consumption includes:

Copper: Data center demand will rise 114% to 1 million tonnes by 2030.

Water: Training GPT-3 consumed an estimated 5.4 million liters; at inference time, roughly one 500 ml bottle of water is consumed for every 10–50 prompts.

Lithium: Projected 173% demand increase by 2030 for backup power.

Tin: Refined consumption growing 2.8% annually through 2030.

Battery Metals: Demand for cobalt and nickel increasing 23% to 82%.
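As a rough sanity check, the cumulative growth figures above can be converted into annualized rates. The sketch below assumes a six-year window (2024 to 2030), which the underlying sources do not state precisely:

```python
def cagr(total_growth_pct: float, years: int) -> float:
    """Convert a cumulative percentage increase into a compound annual growth rate."""
    return ((1 + total_growth_pct / 100) ** (1 / years) - 1) * 100

# Cumulative demand growth figures from this article, assumed to span 2024-2030.
for metal, growth in [("Copper", 114), ("Lithium", 173), ("Nickel (upper bound)", 82)]:
    print(f"{metal}: {growth}% total ≈ {cagr(growth, 6):.1f}% per year")
```

Even the "slowest" of these implies double-digit annual demand growth, which is what underpins the supercycle framing.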

Why Does AI Use So Much Energy?

The intense resource demands of modern AI stem from the computational requirements for developing and operating large language models (LLMs) and other complex systems.

The Cost of Computation: Training vs. Inference

Training large language models is computationally intensive: thousands of GPUs or TPUs (tensor processing units) run for weeks or even months to optimize billions of parameters, drawing enormous amounts of electricity.

While model training draws much attention, the long-term energy burden comes from inference, the everyday use of AI by billions of users, which already accounts for 80–90% of total AI computing power and continues to rise.

Furthermore, generative AI (GenAI) approaches use 30 to 40 times more energy than traditional AI methods. Even small tasks add up: a single ChatGPT request consumes roughly 2.9 watt-hours, and large GenAI models such as Meta’s LLaMA 405B can consume 17 watt-hours per inference.
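To put the per-request figure in perspective, here is a minimal back-of-envelope sketch. The one-billion-requests-per-day volume is a hypothetical assumption for illustration, not a figure from this article:

```python
# Back-of-envelope: annual electricity from chat-style inference alone.
WH_PER_REQUEST = 2.9      # watt-hours per ChatGPT-style request (cited above)
REQUESTS_PER_DAY = 1e9    # hypothetical global request volume (assumption)

annual_twh = WH_PER_REQUEST * REQUESTS_PER_DAY * 365 / 1e12  # Wh -> TWh
print(f"~{annual_twh:.2f} TWh/year")  # ≈ 1.06 TWh/year
```

At that assumed volume, chat inference alone would rival the annual electricity use of a small country's residential sector, before counting training, other models, or cooling overhead.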

Infrastructure and Cooling Requirements

AI workloads necessitate the construction of AI-dedicated data centers featuring high absolute power requirements and higher power density racks.

Power density is projected to grow from 162 kilowatts (kW) per square foot today to 176 kW per square foot in 2027 (excluding overheads).

A substantial fraction of the energy consumed by data centers is dedicated to cooling the high heat generated by servers. Approximately 35–40% of a hyperscaler’s energy consumption is from cooling.

This need for heat rejection translates directly into massive direct, on-site water consumption (Scope 1 water use). Water-intensive cooling towers and evaporation-assisted air cooling are common methods among major tech companies. Depending on weather conditions and operational settings, data centers can evaporate roughly 1 to 9 liters of water per kWh of server energy.
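Combining the 1–9 liters-per-kWh range with the 2.9 Wh-per-request figure quoted earlier gives a rough on-site-only estimate of water per request. Note that this counts only direct evaporation at the data center; the 10–50 prompts-per-bottle figure cited earlier also includes off-site water used in power generation, so the two ranges differ:

```python
# Rough on-site water-per-request estimate, combining figures quoted in this article.
WH_PER_REQUEST = 2.9          # watt-hours per request
LITERS_PER_KWH = (1.0, 9.0)   # on-site evaporation range per kWh of server energy

low, high = (l * WH_PER_REQUEST / 1000 for l in LITERS_PER_KWH)  # liters/request
# Requests needed to evaporate one 500 ml bottle, at each end of the range:
print(f"{0.5 / high:.0f} to {0.5 / low:.0f} requests per 500 ml bottle")
```

The on-site-only figure works out to roughly 3–26 ml per request, which is why estimates that include upstream power-plant water land at far fewer prompts per bottle.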

Impact of AI on Global Electricity Demand

The accelerating AI arms race is driving an unprecedented surge in electricity demand, creating substantial strain on power grids and infrastructure worldwide.

Projected Demand and Infrastructure Strain

Goldman Sachs Research forecasts a sharp rise in data center power demand: up 50% by 2027 and as much as 165% by the end of the decade versus 2023, lifting load from 55 GW to 84 GW by 2027. This surge strains power grids, as utilities face transmission expansion delays, supply chain limits, and high upgrade costs. About $720 billion in grid investment may be needed globally by 2030.
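The Goldman Sachs figures imply the following trajectory, assuming 2023 as the 55 GW baseline year (consistent with the "versus 2023" comparison above):

```python
# Implied trajectory of data center power demand from the figures above.
base_gw, base_year = 55, 2023      # assumed baseline (GW, year)
demand_2027 = 84                   # GW, quoted above
demand_2030 = base_gw * (1 + 1.65)  # +165% vs 2023

growth_2027 = (demand_2027 / base_gw - 1) * 100
cagr_2030 = ((demand_2030 / base_gw) ** (1 / (2030 - base_year)) - 1) * 100
print(f"2027: {growth_2027:.0f}% above 2023; "
      f"2030: ~{demand_2030:.0f} GW ({cagr_2030:.1f}%/yr compounded)")
```

The 84 GW figure for 2027 works out to just over the quoted "up 50%", and the 165% growth to 2030 implies sustaining roughly 15% compound annual growth for seven straight years, a pace grids rarely see.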

Geographic Concentration and Localized Impact

Data center electricity consumption is heavily concentrated, with nearly half occurring in the U.S., 25% in China, and 15% in Europe.

In advanced economies, the IEA estimates that on average, a quarter of electricity demand growth by 2030 will be driven by data centers.

Locally, the impact is even more severe: data centers are projected to account for approximately half of power demand growth in the U.S. and Japan over the next five years.

For instance, data center development in Dominion Energy’s service area in Virginia is projected to double its peak demand over the next 15 years.

The Role of Renewable Energy in Powering AI Technologies

Large technology companies, or hyperscalers (including Meta, Amazon, Google, and Microsoft), are major buyers of clean energy, yet the sheer scale of AI power requirements is challenging their long-standing climate commitments.

The Renewables/Fossil Fuel Dilemma

While these companies are the largest corporate signatories of power purchase agreements (PPAs) with renewable suppliers, accounting for 40% of the global corporate total in the first half of 2025, this procurement falls “woefully short” of the estimated 362 GW of additional power the industry needs worldwide by 2035.

The stress is already visible: major tech companies reported significant increases in carbon emissions in their latest climate filings compared to pre-ChatGPT benchmarks, with Microsoft explicitly blaming “growth-related factors such as AI and cloud expansion”.

Fossil fuels remain the dominant energy source, supplying nearly 60% of power to data centers globally. Data center expansion is increasingly used to justify the prolonged use of fossil fuels.

Notably, gas-fired power generation for data centers is expected to more than double, from 120 TWh in 2024 to 293 TWh in 2035, with a significant portion of this growth occurring in the U.S. Some data center operators are explicitly seeking gas connections to meet their electricity needs.

Challenges in Achieving 24/7 Carbon-Free Energy (CFE)

Achieving ambitious goals like 24/7 CFE for high, constant loads presents significant challenges due to the intermittent nature of wind and solar energy.

Modeling shows that meeting a constant 1 GW load with current technologies (wind, solar, and 4-hour lithium-ion batteries) for 99% of hours requires a nameplate capacity that is 12 times greater than the load. This extreme overbuilding results in substantial capital costs and land requirements.
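The intuition behind such extreme overbuild can be shown with a deliberately idealized toy model (an assumption-laden sketch, not the cited modeling): a constant 1 GW load, cloudless sine-shaped solar output, and a 4-hour battery sized to the solar nameplate. Even under perfect weather the nameplate must be several times the load; real weather variability and the 99%-of-hours target push the multiple toward the 12x cited above:

```python
import math

LOAD = 1.0           # constant load to serve, GW
BATTERY_HOURS = 4    # 4-hour lithium-ion battery

def serves_load(overbuild: float, days: int = 3) -> bool:
    """True if `overbuild` GW of idealized solar plus an equally sized 4-hour
    battery meets a constant 1 GW load for every hour of `days` days."""
    power = overbuild * LOAD        # battery power rating, GW
    cap = power * BATTERY_HOURS     # battery energy capacity, GWh
    energy = cap                    # start with a full battery
    for h in range(24 * days):
        # Idealized solar: sine-shaped output 06:00-18:00, zero at night, no clouds.
        sun = math.sin(math.pi * ((h % 24) - 6) / 12)
        solar = overbuild * LOAD * max(0.0, sun)
        surplus = solar - LOAD
        if surplus >= 0:
            energy = min(cap, energy + surplus)   # charge, up to capacity
        else:
            draw = min(-surplus, energy, power)   # discharge within limits
            energy -= draw
            if draw < -surplus - 1e-9:
                return False                      # unserved load this hour
    return True

# Smallest integer overbuild that survives the idealized multi-day test:
needed = next(o for o in range(1, 20) if serves_load(o))
print(f"Even with perfect weather, solar nameplate must be ~{needed}x the load")
```

In this toy model roughly 4x overbuild is already needed just to ride through ordinary nights; adding cloudy weeks, seasonal variation, and a 99% reliability requirement is what drives the cited 12x figure.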

Consequently, aggressive CFE targets may be more viable through advanced clean technologies such as small modular reactors (SMRs), advanced geothermal, and long-duration energy storage (LDES). Companies are pursuing this “nuclear option,” with Google agreeing to buy nuclear power from a plant in Iowa, and others exploring technologies like carbon capture and storage for gas plants.


Background and Current Bottlenecks

The interaction between rapidly growing data center demand and existing power systems reveals several structural bottlenecks:

  • Grid constraints: Meeting projected demand requires substantial transmission and distribution investment, yet grid expansion is slowed by permitting delays and long construction timelines.
  • Regional concentration: In some regions, data centers already account for more than 20% of national electricity demand, increasing reliance on carbon-intensive generation and stressing local infrastructure.
  • Reliability requirements: AI workloads demand continuous, high-quality power, limiting direct dependence on variable renewable sources without additional system support.

These constraints highlight the need for scalable, low-carbon energy solutions.

Solar PV as a Scalable Solution for AI

Solar photovoltaic energy offers a uniquely scalable pathway for meeting AI-driven electricity demand. Globally, the solar resource reaching Earth exceeds 40 terawatts on average, orders of magnitude larger than current electricity consumption.

In practice, utility-scale solar PV can be deployed within 1–3 years, significantly faster than most conventional generation or transmission projects. Moreover, the levelized cost of electricity from solar has declined by nearly 90% since 2010, making it one of the lowest-cost sources of new power in many regions.
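A roughly 90% total decline translates into a steep compounded annual rate. The sketch below assumes a 2010–2024 window, since the exact endpoint is not stated above:

```python
# Implied average annual LCOE decline from a ~90% total drop over 2010-2024.
total_decline = 0.90       # "nearly 90%" quoted above
years = 2024 - 2010        # assumed endpoint

annual = 1 - (1 - total_decline) ** (1 / years)
print(f"~{annual * 100:.0f}% average decline per year")  # ≈ 15%
```

A sustained ~15% annual cost decline is the kind of learning-curve pace that makes solar attractive for procurement timed against fast data center build-outs.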

These characteristics have driven hyperscalers to rely heavily on solar PV for rapid, large-scale clean energy procurement aligned with data center expansion.

System Integration and Research Motivation

Despite its scalability, solar generation is inherently variable, while data centers typically require power availability exceeding 99.99%. Achieving high shares of solar therefore requires complementary solutions, including energy storage, flexible load management, and grid coordination. Recent studies suggest that maintaining constant power supply with variable renewables alone may require capacity overbuilds of an order of magnitude relative to average load. These system-level challenges motivate research into how solar-centered energy systems can be designed to reliably and efficiently power AI infrastructure, forming the core focus of this work.
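For context, "exceeding 99.99%" is a very tight budget; a quick conversion shows how little downtime four nines allow per year:

```python
# What "four nines" availability means in allowed downtime per year.
availability = 0.9999
downtime_min_per_year = (1 - availability) * 365 * 24 * 60
print(f"{downtime_min_per_year:.1f} minutes of allowed downtime per year")
```

That is under an hour per year, which is why variable solar generation cannot back a data center on its own and must be paired with storage, flexible load, or grid support.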

Which Investments Best Benefit From AI’s Rapidly Growing Power Demand?

With the dramatic increase in electricity demand driven by artificial intelligence, investors can identify profit opportunities across multiple key sectors, including infrastructure, energy supply, raw material supply chains, and efficiency-enhancing technologies.

Data Center Infrastructure & Operators

  • Hyperscalers (Google, Amazon, Microsoft, Meta): Massive AI capex; returns depend on moving from infrastructure to platforms and applications.
  • Wholesale data center operators: Well-positioned to meet surging global demand.
  • Long-term asset managers: Advantage in funding long development cycles with patient capital.

Utilities & Grid Upgrades

  • Power generation and grid expansion: A key bottleneck; global grid spending may reach ~$720bn by 2030.
  • Utilities: Those that manage long permitting timelines and expand transmission capacity stand to benefit.

Clean & Next-Gen Energy

  • Renewables: Big Tech signing PPAs for solar and wind.
  • 24/7 clean baseload: SMRs, advanced geothermal, long-duration storage (LDES), and CCS.
  • Fusion: AI is helping stabilize plasma, making fusion a long-term investment theme.

Critical Metals & Mining

  • Copper: Core to data centers; demand could more than double by 2030.
  • Tin: Essential for electronics; steady demand growth.
  • Battery metals: Lithium, cobalt, graphite, nickel, manganese for storage and backup power.
  • Rare earths: Vital for chips and wind turbine magnets.

Circular Economy & Efficiency

  • Refurbishment & recycling: Short server lifecycles drive e-waste; refurbished gear can cut costs by up to 70%.
  • Energy-efficient hardware & cooling: AI accelerators and advanced liquid cooling offer strong growth potential.
