
How much water does ChatGPT actually use? Let's talk about it.

  • Writer: Pamela Minnoch
  • May 12
  • 2 min read

There's been a lot of back-and-forth lately about AI's environmental footprint, especially when it comes to how much water tools like ChatGPT use.


Depending on where you look, you'll find people raising serious concerns, and others rolling their eyes at the whole debate.


So...what's actually going on here?


How much water does AI use, and should we be worried about it?


Let's break it down.


The concern: AI uses a lot more water than most people realise.

One of the main criticisms of large AI models (like ChatGPT) is the sheer amount of resources required to run them.


We're not just talking about electricity. Water plays a key role too, mainly to cool down massive data centres that keep these systems running.


And when you look at it at scale, yes, it adds up.


Some estimates suggest that generating just a few dozen prompts can use up half a litre of water, mostly indirectly, through cooling systems.


That might sound like a small amount until you consider millions of users and billions of queries. And that's not even counting the enormous resources used to train these models in the first place.
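To give a rough sense of scale: half a litre per few dozen prompts works out to something under 20 millilitres each. Assume, purely for illustration, a billion prompts a day, and you're looking at somewhere in the region of 17 million litres of water every single day.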


So, from this perspective, critics argue that we need to take AI's environmental impact seriously, including its water consumption.


The pushback: Compared to what?

On the flip side, others say the argument is missing context.


They point out that water is used in everything, from watching Netflix to scrolling on Instagram to eating a cheeseburger. And not just a little.


The beef in a single hamburger, once you count everything it takes to raise and feed the cattle? We're talking over 2,000 litres of water.


That dwarfs the daily use of AI tools by most people.
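Taking both estimates at face value, it's a rough comparison: at half a litre per few dozen prompts, you'd need well over 100,000 prompts to match the 2,000-litre footprint of one burger.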


The argument here is: if we're going to talk about water use, we should look at all areas of our lives, not just AI. Especially when AI has the potential to improve productivity, education, healthcare, and more.


So...who's right?

Honestly? Both sides make valid points.


It is important to understand the environmental impact of generative AI. But it's also true that we tend to single out new technologies without looking at the bigger picture.


It's not about taking a hard "pro" or "anti" AI stance; it's about being clear-eyed and consistent.


If we're going to talk about AI's water usage, we need to talk about:

  • How much water is used to train and run these tools

  • How it compares to other digital habits (like streaming and gaming)

  • And how all of that stacks up against other common behaviours (like diet and transport)


It's not helpful to cherry-pick numbers or use them to win an argument. What we do need is more transparency, more context, and more nuance in how we evaluate all technologies, including AI.


So, what can professionals and business leaders take away from this?

If you're using or planning to use AI in your business:

  • Be curious about the environmental impact, not just the productivity gains

  • Keep an eye out for vendors and tools that are transparent about their resource use

  • And encourage informed conversations, not polarised ones


Tech evolves fast. Our understanding and our decision-making need to evolve with it.


Do you think we're asking the right questions when it comes to AI and sustainability? Or are we too focused on one piece of the puzzle?

 
 
 
