You type a question. ChatGPT answers in two seconds. You close the tab and move on with your day. What you didn't see — what you were never meant to see — is what happened in between. The servers that woke up to process your request. The electricity they burned. The water that was quietly consumed to stop those servers from melting. All for one prompt.
This isn't a climate-crisis warning for 2050. This is already happening, at a scale that most people — including most people in tech — have genuinely no idea about. And the companies responsible for it are, at best, vague about the numbers. At worst, deliberately evasive.
So let's do the math ourselves.
Why Does AI Even Need Water?
AI models don't run on your laptop. They run on massive data centres — warehouses packed wall to wall with thousands of high-performance GPUs, running at full tilt, 24 hours a day. These chips generate enormous amounts of heat. That heat has to be managed, or the hardware fails.
The answer to that heat problem is, largely, water. Data centres use evaporative cooling systems — cooling towers, liquid loops, chilled water circuits — that consume water to regulate temperature. When the water evaporates, it takes the heat with it. Efficient physics. Enormous scale.
And AI workloads are the most compute-intensive workloads on earth right now. Training a frontier model, running inference at billions of requests per day, processing images, audio, and video — all of it generates heat, all of it requires cooling, and all of it consumes water.
Water used in AI cooling is largely "consumed" in the true sense — it evaporates into the atmosphere and does not return to the local water supply. It is gone. Unlike energy (which can be offset by renewables), water consumed in cooling is a direct, irreversible draw on local water systems — often in regions already under water stress.
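For a sense of why the volumes are what they are, here is a rough back-of-the-envelope calculation. It's a sketch, assuming every joule of server electricity ends up as heat and that cooling is purely evaporative, and it uses nothing more than the latent heat of vaporisation of water:

```python
# Back-of-the-envelope: how much water must evaporate to carry away 1 kWh of server heat?
# Assumes all electricity becomes heat and cooling is purely evaporative.

LATENT_HEAT_J_PER_KG = 2.26e6   # energy absorbed by evaporating 1 kg of water, in joules
KWH_IN_JOULES = 3.6e6           # 1 kWh expressed in joules

litres_per_kwh = KWH_IN_JOULES / LATENT_HEAT_J_PER_KG   # 1 kg of water is roughly 1 litre
print(f"Ideal evaporative cooling: ~{litres_per_kwh:.1f} L of water per kWh of heat rejected")
# -> ~1.6 L/kWh, the same order of magnitude as commonly cited data-centre
#    Water Usage Effectiveness (WUE) figures of roughly 1-2 litres per kWh.
```

Real facilities vary widely depending on climate, cooling design, and how much heat is rejected without evaporation, but the order of magnitude is the point: cooling compute means consuming litres of water per kilowatt-hour.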
The Numbers That Should Genuinely Concern You
In late 2025, researcher Alex de Vries-Gao at Vrije Universiteit Amsterdam published one of the most comprehensive attempts to calculate the global environmental footprint of AI systems. The findings — published in the scientific journal Patterns — were striking, not just because of the scale, but because of how hard the numbers were to come by in the first place.
To understand the trajectory, here's how the numbers break down across key metrics:
| Metric | Figure |
|---|---|
| AI's share of global data centre energy (2024) | 15–20% |
| US data centre electricity share of national grid (2023) | ~4% |
| Projected US data centre electricity share (2028) | Up to 12% |
| Global data centre demand growth (2022 → 2026) | More than doubled |
| Projected global demand by 2030 | ~945 TWh |
| Cumulative data centre emissions (2025–2030) | ≈ 40% of US annual total |
| US total electricity demand increase by 2033 vs 2024 | ~20% |
For scale: New York City emitted approximately 52.2 million tonnes of CO₂ in 2023. The de Vries-Gao study calculates that AI could be responsible for 32.6 to 79.7 million tonnes annually. That's not a projection for some distant future — that's the 2025 range, based on current AI demand.
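That comparison is easy to check from the figures quoted above:

```python
# Compare the estimated annual AI emissions range to New York City's 2023 emissions,
# using only the figures quoted in this article.

nyc_mt = 52.2                        # NYC emissions in 2023, million tonnes CO2
ai_low_mt, ai_high_mt = 32.6, 79.7   # estimated annual AI emissions range, million tonnes CO2

print(f"AI emissions: {ai_low_mt / nyc_mt:.1f}x to {ai_high_mt / nyc_mt:.1f}x New York City's annual total")
# -> roughly 0.6x to 1.5x the emissions of New York City, every year
```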
Per Prompt: What Does One AI Query Actually Cost?
This is where it gets personal. The aggregate numbers are hard to feel — millions of tonnes, hundreds of terawatt-hours. Let's scale it down to a single interaction.
Multiple independent research groups have attempted to calculate the cost of a single AI prompt. The range is wide, because it depends on the model, the data centre location, the length of the query, and the infrastructure design. But here's the honest picture:
- A single ChatGPT prompt consumes approximately 0.3 to 3 Wh of electricity — compared to roughly 0.3 Wh for a standard Google search (Google's own figure of 0.0003 kWh per search). That's a gap of up to roughly 10x.
- OpenAI CEO Sam Altman disclosed in 2025 that a typical ChatGPT prompt uses approximately 0.34 Wh and 0.3 mL of water. But independent researchers note this figure likely excludes indirect water use — the water consumed to generate the electricity powering the servers in the first place.
- Research from UC Riverside estimated that Microsoft's US data centres use roughly 500 mL of water for every 10 to 50 medium-length GPT-3-scale responses — or 10 to 50 mL per prompt.
- At roughly one billion weekly users interacting with generative AI, even these small per-prompt figures compound into millions to hundreds of millions of litres of water consumed every week, before counting indirect water use (see the rough sketch after this list).
- For image generation, video generation, and multi-step agentic workflows, the energy and water costs multiply significantly — often by 5x to 20x over a simple text query.
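To make the scaling concrete, here is a rough sketch that turns the per-prompt figures quoted above into a weekly global total. The prompts-per-user number is a pure assumption for illustration; the per-prompt water figures are the ones cited in the list:

```python
# Rough scaling of per-prompt water estimates to a week of global usage.
# prompts_per_user_per_week is an ASSUMPTION for illustration; the per-prompt
# figures are the ranges quoted above (direct cooling water only).

weekly_users = 1_000_000_000       # ~1 billion weekly users
prompts_per_user_per_week = 10     # assumed, illustrative

low_ml_per_prompt = 0.3            # Altman's disclosed on-site figure
high_ml_per_prompt = 50            # upper end of the UC Riverside estimate

prompts_per_week = weekly_users * prompts_per_user_per_week
for label, ml in (("low", low_ml_per_prompt), ("high", high_ml_per_prompt)):
    litres = prompts_per_week * ml / 1000
    print(f"{label} estimate: ~{litres / 1e6:,.0f} million litres of water per week")
# -> ~3 million litres/week at the low end, ~500 million litres/week at the high end,
#    before counting the indirect water used to generate the electricity.
```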
These are estimates because companies don't disclose per-query figures. The numbers we have come from researchers reverse-engineering public data — GPU specs, energy contracts, revenue figures, and facility disclosures. The real numbers may be higher.
AI can use roughly 10x more electricity per query than a Google search. Most people using AI every day have no idea they're making that trade.
— Vikas Patel, Manas AI
Why Doesn't Big Tech Just Tell Us?
Because there's no legal requirement to, and the numbers aren't flattering.
The de Vries-Gao study specifically called out the opacity of major technology companies on this issue. There is currently no standardised framework for AI-specific environmental reporting. Companies disclose what they choose to, in whatever format suits their PR, at whatever cadence they prefer.
The most revealing example: Google, in an explanatory note accompanying its environmental impact report for the Gemini model, explicitly stated it did not wish to report indirect water use because it does not fully control water consumption at the power plants that supply its electricity.
Think about that for a moment. The water used to generate the electricity that powers the AI — the majority of the water footprint — is simply excluded from reporting, because Google doesn't technically own those power plants.
There is no standard for AI environmental reporting. Companies publish aggregate figures, exclude indirect consumption, and change their methodology between reports. Researchers like de Vries-Gao are forced to build models from fragments of public data — hardware specs, revenue disclosures, facility counts — to get any picture at all. The picture they get is already alarming. The real picture is likely worse.
The Trajectory: Getting Worse, Not Better
You might assume that as AI hardware gets more efficient, the environmental impact will stabilise or improve. That's partly true at the chip level — newer GPUs do more compute per watt. But it's being entirely overwhelmed by growth in demand.
- Global data centre electricity demand is projected to nearly double by 2030, from ~415 TWh to ~945 TWh, with AI workloads accounting for a disproportionate share of that growth.
- AI-driven data centre buildout in the US is expected to increase total national electricity demand by 9% by 2028 and nearly 20% by 2033, compared to 2024 levels — in a country where demand had been essentially flat for two decades.
- Water consumption from data centre cooling could double or quadruple current levels by 2028, according to projections from UC Riverside researchers.
- Cumulative data centre CO₂ emissions between 2025 and 2030 could equal 40% of the United States' total annual emissions — equivalent to the emissions of over 540 million petrol cars.
- New model generations are not just more capable — they're also being used for more complex, more compute-heavy tasks. The tasks themselves are getting bigger even as the chips get better.
Efficiency improvements help. They are not coming anywhere close to keeping pace with the growth in usage.
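A toy projection makes the point. The rates below are assumptions chosen purely for illustration, not measured figures:

```python
# Toy projection: steady per-query efficiency gains vs. faster growth in query volume.
# Both rates are illustrative ASSUMPTIONS, not measured figures.

energy_per_query = 1.0   # normalised to 1.0 in year 0
query_volume = 1.0       # normalised to 1.0 in year 0

for year in range(1, 6):
    energy_per_query *= 0.8   # 20% efficiency improvement per year
    query_volume *= 2.5       # 2.5x growth in usage per year
    total_energy = energy_per_query * query_volume
    print(f"Year {year}: total energy = {total_energy:.0f}x the starting level")
# -> total energy doubles every year, even though each individual query keeps getting cheaper
```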
Does This Mean We Should Stop Using AI?
No. That isn't the argument, and it wouldn't be honest to frame it that way.
AI has genuine, measurable value — in drug discovery, in climate modelling, in accessibility, in productivity tools that genuinely improve people's working lives. The goal isn't to switch it off. The goal is to make better decisions about how it's built, where it runs, and what it's used for.
Three things need to happen that aren't happening fast enough:
- Mandatory environmental disclosure. AI companies should be required to publish per-model energy and water consumption data — not vague, curated aggregate figures. Every model card should have an energy and water footprint, in the same way nutritional information appears on food packaging. You have a right to know what you're consuming.
- Smarter model selection. Not every task needs a frontier model. A 7B-parameter model running locally costs a fraction of a GPT-4-class API call for tasks like summarisation, classification, or simple Q&A. Developers and businesses who are intentional about this can cut their AI footprint by as much as 90% for certain workflows without meaningful quality loss (see the sketch after this list).
- Geography-aware infrastructure. The location of a data centre and the source of its electricity matters more than almost any other factor. The same computation performed on a grid powered by Icelandic geothermal energy versus coal-heavy grid electricity has a wildly different carbon and water profile. Where AI runs is a design choice — and most companies are not making it consciously.
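As a sketch of what the second and third points can look like in practice, here is a simplified footprint estimator that routes each task to the smallest adequate model and accounts for the carbon intensity of the grid it runs on. Every number in it is a rough, illustrative assumption, not a vendor-disclosed figure:

```python
# Toy footprint estimator: route each task to the smallest adequate model and
# account for the grid it runs on. All figures are rough, illustrative assumptions.

ENERGY_WH_PER_CALL = {
    "local-7b": 0.05,          # small model on local or shared hardware
    "frontier-api": 3.0,       # large hosted frontier model
}

GRID_G_CO2_PER_KWH = {
    "geothermal-heavy": 30,    # e.g. an Icelandic-style grid
    "coal-heavy": 800,
}

TASK_TO_MODEL = {              # route simple tasks to the small model
    "classification": "local-7b",
    "summarisation": "local-7b",
    "complex-reasoning": "frontier-api",
}

def footprint_g_co2(task: str, region: str, calls: int) -> float:
    """Estimated grams of CO2 for running `calls` invocations of `task` in `region`."""
    wh = ENERGY_WH_PER_CALL[TASK_TO_MODEL[task]] * calls
    return (wh / 1000) * GRID_G_CO2_PER_KWH[region]

# The same volume of work, two very different design choices:
print(footprint_g_co2("summarisation", "geothermal-heavy", calls=10_000))    # ~15 g CO2
print(footprint_g_co2("complex-reasoning", "coal-heavy", calls=10_000))      # ~24,000 g CO2
```

The absolute numbers are invented, but the shape of the result is the point: model choice and grid location compound, and both are decisions a team actually controls.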
What This Means If You're Building with AI
If you're a startup founder, product manager, or business owner building AI-powered products — or just using AI tools daily — this isn't only an ethical consideration. It's increasingly a commercial and regulatory one.
- Enterprise clients in the EU are beginning to ask about AI carbon footprints as part of supplier due diligence and procurement. This will accelerate as the EU AI Act and related sustainability regulations come into force.
- Model efficiency is becoming a genuine competitive differentiator. Faster, cheaper, and lower-footprint models win contracts — particularly in markets where sustainability commitments matter to the buyer.
- Agentic AI workflows — where one task triggers dozens of sub-tasks, each making API calls — multiply the environmental cost in ways most developers haven't thought about. A poorly designed agent can use 50x the compute of a simple query chain to achieve the same outcome. Architecture matters (see the sketch after this list).
- For any company with sustainability reporting obligations, AI usage will increasingly need to be accounted for. Getting ahead of this now — understanding your stack's footprint and making deliberate choices — is far cheaper than retrofitting later.
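Here is a rough sketch of that architecture point: the same outcome, achieved either by calling a frontier model for every sub-step or by batching related steps and routing the simple ones to a small model. The call counts and per-call energy figures are illustrative assumptions:

```python
# Toy comparison: a naive agent vs. a batched, routed agent doing the same job.
# Call counts and per-call energy figures are illustrative ASSUMPTIONS.

FRONTIER_WH_PER_CALL = 3.0
SMALL_MODEL_WH_PER_CALL = 0.05

# Naive design: all 60 sub-steps are separate frontier-model calls.
naive_wh = 60 * FRONTIER_WH_PER_CALL

# Deliberate design: 40 simple sub-steps go to a small model, and the remaining
# related steps are batched into 5 frontier-model calls.
routed_wh = 40 * SMALL_MODEL_WH_PER_CALL + 5 * FRONTIER_WH_PER_CALL

print(f"Naive agent:  {naive_wh:.0f} Wh per task")    # 180 Wh
print(f"Routed agent: {routed_wh:.0f} Wh per task")   # 17 Wh
print(f"Reduction:    ~{naive_wh / routed_wh:.0f}x less energy for the same outcome")
```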
At Manas AI, when we design custom AI agents and automation workflows for clients, model selection and task architecture are part of the brief from the start. Not just performance and cost — but efficiency. A well-designed agent does more with fewer calls, uses the right model for each task, and doesn't spin up compute unnecessarily. That's good engineering. And it turns out it's also good for the planet.
The Bottom Line
AI is not magic. It runs on physical infrastructure that consumes real resources — electricity, water, land, and hardware manufactured at enormous environmental cost. The invisibility of that infrastructure doesn't make the cost disappear. It just means you don't see the bill.
Every ChatGPT prompt costs something. Every image generation, every document summary, every agentic workflow. Right now, that cost is largely invisible, broadly unregulated, and growing exponentially.
The least we can do — as users, as builders, as people who care about the world we're building AI into — is know the number. Ask the question. Demand the disclosure. And make more deliberate choices about what we build and how we build it.
The water isn't going to ask for itself.
Building with AI? Let's build it right.
At Manas AI, we design custom AI agents and automation workflows with efficiency built in from the start — right model, right task, right architecture. For startups and SMBs across India, the UK, and the US.
Explore Manas AI →
Sources & References
- de Vries-Gao, A. (2025). AI's hidden carbon and water footprint. Vrije Universiteit Amsterdam / Patterns journal.
- Food & Water Watch (February 2026). A No Brainer: How AI's Energy and Water Footprints Are Growing.
- Patel, R., Mahalingam, N., Patel, A. (2025). The Environmental Impact of AI Servers and Sustainable Solutions. Rutgers University / arXiv:2601.06063.
- Li, S. et al. (2025). Making AI Less Thirsty: Uncovering and Addressing the Secret Water Footprint of AI Models. UC Riverside / arXiv:2304.03271.
- Google (2025). Measuring the environmental impact of delivering AI at Google Scale. arXiv:2508.15734.
- International Energy Agency (2024). Data Centres and Data Transmission Networks — Electricity use estimates.
- MIT CSAIL / Bashir, N. (2025). Explained: Generative AI's environmental impact. MIT News.



