re:Connect archive

author biography

Andrew Gregg is the founder of re:Connect and owner of Gregg Consulting. His background is in higher education (in computing) and Christian ministry (as a chaplain).

He is a Fellow of the Institute of Leadership and a Chartered IT Professional, and is currently studying for an MA in Practical Theology.

The Hidden Environmental Toll of AI: How Data Centres Are Draining Power and Water

As artificial intelligence continues its meteoric rise—powering everything from chatbots to autonomous systems—its physical footprint is often obscured behind slick interfaces and catchy headlines. Behind every “smart” response, there’s a sprawling data centre generating heat, consuming energy, and guzzling water. This reality raises urgent questions: at what cost does AI scale? And are we ignoring a growing environmental debt?

Electricity Consumption: The Power Problem

  1. Explosive Growth in Energy Demand
    AI is driving an unprecedented surge in electricity consumption within data centres. According to the International Energy Agency (IEA), data centre electricity demand could nearly double by 2030, reaching roughly 945 TWh under its base case. Crucially, accelerated servers (those used for AI workloads) account for a disproportionate share of this growth, with their electricity demand projected to rise by about 30% annually in some scenarios (a short compound-growth sketch follows this list).
  2. Grid Strain and Carbon Risk
    Meeting more power demand is not just a matter of flipping a switch. If that electricity comes from fossil-fuel-heavy grids, the carbon emissions multiply. Already, some studies have flagged that data centres’ carbon intensity (CO₂e per kWh) in the U.S. exceeds national averages by a significant margin (arXiv). Without rigorous sustainable energy sourcing, scaling AI risks undermining climate goals, the very aspiration many proponents of AI claim to support.
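
To make the growth figures concrete, here is a minimal Python sketch of what sustained ~30% annual growth implies over six years. The 100 TWh starting value for accelerated-server demand is an illustrative assumption, not an IEA figure; only the ~945 TWh total and the ~30% growth rate come from the projections cited above.

    # Illustrative compound-growth sketch: what ~30% annual growth implies over six years.
    # The 100 TWh starting point for accelerated (AI) servers is an assumption for
    # illustration only; the ~945 TWh total is the IEA base-case figure cited above.

    def project_demand(start_twh: float, annual_growth: float, years: int) -> float:
        """Return demand after compounding annual_growth for the given number of years."""
        return start_twh * (1 + annual_growth) ** years

    baseline_ai_twh = 100.0  # assumed accelerated-server demand today (illustrative)
    for year in range(7):
        print(f"Year {year}: {project_demand(baseline_ai_twh, 0.30, year):,.0f} TWh")

    # Six years of 30% growth multiplies demand by 1.3 ** 6, roughly 4.8x, which is why
    # accelerated servers dominate the projected growth toward ~945 TWh by 2030.

The point of the sketch is the multiplier rather than the absolute numbers: anything compounding at 30% a year nearly quintuples within six years, whatever its starting point.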

Water Consumption: The Underreported Crisis

  1. Cooling Isn’t Just Air
    Servers working on large AI models generate huge amounts of heat, and cooling them down is not trivial. Many data centres rely on evaporative cooling systems, which consume large volumes of water (The AI Journal). To put numbers on it: a 1 MW data centre can use up to 25.5 million litres of water per year just for cooling (World Economic Forum). A worked conversion of that figure follows this list.
  2. Indirect Water Use via Electricity
    The water footprint of AI isn’t limited to on-site cooling. The generation of electricity that powers these data centres also consumes water, especially in thermoelectric plants (ITU). In fact, some analyses suggest that indirect water use from power generation may outstrip direct data-centre water use (ITU).
  3. Escalating Global Risk
    According to Morgan Stanley, AI data centres’ water consumption could jump to roughly 1,068 billion litres annually by 2028 for cooling and electricity generation combined (Mint). Many of these facilities are in regions already vulnerable to water stress. In the UK, for example, water resource constraints are projected to worsen, raising serious concerns about siting new “AI growth zones” (GOV.UK).
  4. Life-Cycle Water Usage
    Beyond operations, AI’s water footprint also extends to the manufacturing of hardware: chip fabrication is notoriously water-intensive, requiring ultrapure water (The Economic Times). According to academic research, building and training large language models can consume millions of litres of water even before deployment (arXiv).
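
To put the cooling and consumption figures above into everyday units, here is a short back-of-the-envelope Python sketch. The assumptions are labelled in the comments: a 1 MW facility running at full IT load all year, and an Olympic swimming pool of roughly 2.5 million litres. The 25.5 million litre and 1,068 billion litre figures are the ones cited above.

    # Back-of-the-envelope conversions for the water figures cited above.
    # Assumptions (illustrative): the 1 MW facility runs at full IT load all year,
    # and an Olympic swimming pool holds roughly 2.5 million litres.

    ANNUAL_COOLING_WATER_L = 25_500_000   # litres/year for a 1 MW data centre (figure cited above)
    HOURS_PER_YEAR = 8_760
    IT_LOAD_KW = 1_000                    # 1 MW of IT load, assumed continuous

    annual_energy_kwh = IT_LOAD_KW * HOURS_PER_YEAR              # ~8.76 million kWh
    litres_per_kwh = ANNUAL_COOLING_WATER_L / annual_energy_kwh  # ~2.9 L per kWh of IT load
    litres_per_day = ANNUAL_COOLING_WATER_L / 365                # ~70,000 L per day

    print(f"Cooling water per kWh of IT load: {litres_per_kwh:.1f} L")
    print(f"Cooling water per day: {litres_per_day:,.0f} L")

    # The projected 1,068 billion litres per year by 2028, expressed in Olympic pools:
    OLYMPIC_POOL_L = 2_500_000
    print(f"Olympic pools per year: {1_068_000_000_000 / OLYMPIC_POOL_L:,.0f}")  # ~427,000

Even on these rough numbers, a single megawatt of evaporatively cooled AI compute draws tens of thousands of litres a day, which is precisely what makes siting in water-stressed regions so contentious.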

The Bigger Picture: Sustainability vs. Scalability

  • Transparency Gap: Despite the gravity of the issue, data centre operators and tech giants often underreport or obscure their resource use. The UK government’s own report highlighted how water and energy consumption data are “frequently overlooked or underreported,” complicating regulation and sustainability planning (GOV.UK).
  • Local Pressure, Global Consequences: In water-stressed regions, placing massive AI data hubs could pit corporate infrastructure against local water needs, with consumers, agriculture, and ecosystems all potentially losing out (Global Action Plan).
  • Technological Band-Aids Aren’t Enough: There is growing interest in circular water solutions (closed-loop cooling, water recycling, water replenishment), but adoption lags, and in many places traditional open-loop systems still dominate (World Economic Forum).
  • Regulatory Blindspots: There are increasing calls for mandatory environmental reporting from major tech players. Without such frameworks, we risk locking in unsustainable infrastructure (The Guardian).

Why This Matters — and What Can Be Done

  1. Strategic Policy Intervention
    Governments should require transparent reporting of energy and water metrics from data centres, especially those designed for AI. This data is essential for planning, especially in regions already under water stress.
  2. Smart Site Planning
    When selecting locations for AI data centres, planners must account for local water availability, water stress, and grid capacity—not just land cost or proximity to customers.
  3. Investment in Alternative Cooling
    Data centre operators should accelerate adoption of advanced cooling technologies (e.g., liquid immersion cooling, closed-loop systems) that greatly reduce water consumption.
  4. Green Energy Integration
    Pairing data centres with clean power (renewables, low-carbon generation) helps mitigate the carbon and indirect water costs tied to electricity production.
  5. Circular Water Economics
    Tech companies must invest in “water replenishment” projects (e.g., restoring water in stressed basins), wastewater reuse, and innovative water-management models as core parts of their growth strategy—not as peripheral CSR projects.

Conclusion

AI’s promise is undeniable: it has the capacity to transform industries, drive innovation, and unlock new capabilities in science, business, and society. But if we ignore its environmental footprint—especially in terms of energy and water—we risk fueling a technology boom that harms the very planet it purports to help.

Scaling AI sustainably demands more than just faster chips and bigger models; it requires confronting the resource demands of the infrastructure that supports it—and making tough choices about power, water, and the long-term cost of “intelligence.” Without that reckoning, we may be building an AI future on foundations that are anything but green.
