Generating just 100 words with GPT-4 consumes 3 bottles of water: the environmental cost of AI

A new study has highlighted an environmental problem that often goes unnoticed: water consumption by artificial intelligence. The growing reliance on language models like GPT-4 is placing enormous demands on resources. While these technologies have transformed the way we work and communicate, their widespread use is taking a heavy toll on the planet.

A study by the University of California, reported by The Washington Post, revealed that generating just 100 words with GPT-4 consumes about 1.4 liters of water, roughly the equivalent of three half-liter bottles. This is due to the enormous energy required by data centers to run AI models and the water needed to cool their servers.

The environmental impact is even greater when we consider the millions of queries ChatGPT receives every day, many of them trivial. Not only is water consumed, but large amounts of electricity are also needed to keep the servers running.
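To get a rough sense of that scale, a simple back-of-envelope calculation can be sketched. The per-response figure below comes from the study cited above; the daily query volume is a hypothetical placeholder, since exact usage numbers are not public:

# Rough back-of-envelope estimate of daily water use from chat queries.
# Assumptions (for illustration only):
#   - ~1.4 liters of water per ~100-word response, per the study cited above
#   - 10 million queries per day: a hypothetical figure, not an official number

LITERS_PER_RESPONSE = 1.4            # study's figure for a 100-word reply
QUERIES_PER_DAY = 10_000_000         # assumed daily volume, for illustration

daily_liters = LITERS_PER_RESPONSE * QUERIES_PER_DAY
# An Olympic swimming pool holds roughly 2.5 million liters.
print(f"Estimated daily water use: {daily_liters:,.0f} liters "
      f"(~{daily_liters / 2_500_000:.0f} Olympic pools)")

Under these assumed numbers, the estimate comes out to about 14 million liters per day, underscoring how quickly small per-query costs add up.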

According to Computer Hoy, running language models such as GPT-4 requires intensive cloud computing, which involves thousands of chips, most of them made by Nvidia. These chips generate a great deal of heat and require efficient cooling systems, which consume large amounts of water.

The study also found that if 10% of Americans used GPT-4 once a week, it would consume the same amount of electricity that all households in Washington, D.C. use in 20 days.

Representatives from OpenAI, Meta, Google and Microsoft have reaffirmed their commitment to reducing water consumption in their data centers, but have not yet presented specific solutions.

By Editor
