Quick introduction

It’s all anyone seems to talk about, the biggest buzzword of 2025: AI. This is obviously a vast topic and, honestly, you could write a book about it - many already have. But what I want to focus on is this: what is the environmental price of the AI chatbots we are all becoming so accustomed to using every single day?

To be clear, I think AI chats are absolutely phenomenal and, if used well, can significantly increase efficiency in many areas of our lives. Whilst I believe there is no substitute for human input and creativity - and thank goodness for that, else I’d be out of a job - I do think AI has a huge amount to offer, not just to other creatives but to all kinds of industries and workflows.

With that said, however, we should absolutely be in the know about what is going on behind the screen when we use these tools, as it feels like that is not yet common knowledge. So in this article, I’m going to discuss some of these hidden costs - without agenda, just to raise some awareness for whoever is interested.

Just because it’s free doesn’t mean there isn’t a price.

The actual cost of building artificial intelligence products is not easy to measure - but one thing is for sure: they use a lot of water.

AI systems run on powerful machines - effectively supercomputers - that generate enormous amounts of heat and need to be kept cool, which is mainly done with water. Data centres in Iowa, for instance, consumed over 11.5 million litres of water in a single month during the training of GPT-4 (AP News, 2023).

That’s a fairly alarming figure - especially considering how rapidly our reliance on AI tools has grown in just a couple of years - though how alarming it really is depends on whether the data centre uses closed-loop cooling (recycled water).

Some AI systems do use closed-loop cooling, where water is reused through a sealed cycle. Some even use greywater (recycled wastewater). But many still rely on evaporative cooling, where water is lost into the air and must be continually refilled. So, while cooling is gradually becoming more sustainable, it still poses a significant environmental question: can this ever truly scale efficiently?

Energy hungry

In a similar vein to its water usage, AI is also extremely power-intensive - especially when it comes to training models. For example, training GPT-3 is estimated to have used 1,287 megawatt-hours of electricity - roughly the same as what 120 average U.S. homes consume in a year - and produced over 550 tonnes of CO₂ emissions (Wired, 2023).
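
As a quick sanity check on that “120 homes” comparison, here is a rough back-of-the-envelope sketch. The household figure is an assumption (about 10,700 kWh per year, which is roughly what the comparison implies and close to published U.S. averages); the training figure is the widely cited estimate quoted above:

  # Rough check of the "120 homes" comparison for GPT-3 training energy.
  # The household figure is an assumption (~10,700 kWh/year); real usage
  # varies widely by home, region, and year.
  TRAINING_MWH = 1_287              # widely cited GPT-3 training estimate
  HOUSEHOLD_KWH_PER_YEAR = 10_700   # assumed average U.S. household

  training_kwh = TRAINING_MWH * 1_000
  homes_for_a_year = training_kwh / HOUSEHOLD_KWH_PER_YEAR
  print(f"GPT-3 training ~ {homes_for_a_year:.0f} U.S. homes for a year")  # ~120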

But it’s not just training. Even everyday use such as chatting with AI consumes far more electricity than most people might realise. A single AI prompt can use nearly 10 times the energy of a Google search (IEA via WSJ, 2024). Why? Because that one prompt triggers a cascade of processes:

  • Massive servers spin into action to generate a response;
  • Your data has to be transmitted, processed, and stored;
  • Infrastructure - from routers to cooling systems - must remain online 24/7;
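
To put that “nearly 10 times” figure in context, here is another rough sketch. The per-query numbers below are assumptions based on commonly quoted estimates (roughly 2.9 Wh per AI prompt versus about 0.3 Wh per Google search); actual values vary a lot by model, hardware, and data centre:

  # Back-of-the-envelope comparison of per-query energy use.
  # Both per-query figures are assumptions based on commonly quoted
  # estimates; treat them as ballpark values, not measurements.
  AI_PROMPT_WH = 2.9        # assumed watt-hours per AI chat prompt
  GOOGLE_SEARCH_WH = 0.3    # assumed watt-hours per Google search

  ratio = AI_PROMPT_WH / GOOGLE_SEARCH_WH
  print(f"One AI prompt ~ {ratio:.0f}x the energy of one search")   # ~10x

  # Scale it up: 100 prompts a day for a year.
  prompts_per_year = 100 * 365
  kwh_per_year = prompts_per_year * AI_PROMPT_WH / 1_000
  print(f"100 prompts/day for a year ~ {kwh_per_year:.0f} kWh")     # ~106 kWh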

So the “cloud” that powers AI is very much grounded in real-world energy use - energy that, depending on where the data centre is located, is often generated from fossil fuels (IEA, 2024).

So, how can we all use AI responsibly?

AI is not as innocent as we all thought. That doesn’t mean we have to swear it off altogether - in fact, far from it. But we can start being more intentional about how we use it.

Ask yourself:

  • Do I really need to use AI, or can I find the solution on my own with a bit of good old-fashioned brainpower?
  • Am I just looking for a quick and easy answer?

Here are some ways you can use AI responsibly and efficiently:

  • Use AI when it adds value, not just to fill silence or dodge thinking;

  • Try to batch your prompts;
  • Avoid generating unnecessary content;
  • Be mindful of how often you’re leaning on it for things you could do yourself;
  • And more than anything else, avoid jumping on AI image trends - these are by far the worst offenders for energy use, and for what? To delete the image shortly after?

I am really interested to see how platforms start investing in greener infrastructure, and I have no doubt they will. Like it or not, AI is here now and will likely only become more embedded in our day-to-day lives, so my hope is that we find ways to reduce its carbon footprint.

AI tools are powerful - and, as we know, with power comes a bit of responsibility. For us as individuals, being aware of the impact of our actions is the first and necessary step.

Even just knowing the cost behind the convenience? That’s a great place to start.


Daniel Morris

Founder, LUX
