Artificial intelligence requires increasingly large amounts of electricity, and that's not all – just one standard set of questions for ChatGPT consumes an estimated bottle's worth of water

Artificial intelligence, cryptocurrencies and data centers together consumed 460 terawatt hours of electricity in 2022, according to a recent assessment by the International Energy Agency (IEA). That amount corresponded to two percent of the world's electricity consumption.

However, the projected growth in consumption is striking. The IEA predicts that by 2026 the industry will use slightly more than 800 terawatt hours of electricity in its base scenario, and up to 1,050 terawatt hours in its extreme scenario.

For comparison, according to Statistics Finland, Finland's total electricity consumption for all purposes in 2023 was just under 80 terawatt hours.
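The scale of these comparisons can be checked with a few lines of arithmetic. This is a sketch; the numbers are the article's own estimates:

```python
# Figures from the article (terawatt hours)
sector_2022 = 460      # AI, crypto and data centers combined, 2022
world_share = 0.02     # two percent of world consumption
extreme_2026 = 1050    # IEA extreme scenario for 2026
finland_2023 = 80      # Finland's total consumption, 2023 (approx.)

# Implied world electricity consumption in 2022
world_total_2022 = sector_2022 / world_share
print(round(world_total_2022))               # → 23000 (TWh)

# The sector's use expressed in "Finlands"
print(round(sector_2022 / finland_2023, 1))  # → 5.8
print(round(extreme_2026 / finland_2023, 1)) # → 13.1
```

In other words, the sector already uses several times Finland's entire annual consumption, and the extreme 2026 scenario is over thirteen times it.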

Large language models consume electricity

According to LUT University professor Paavo Ritala:

“The training of large language models requires a lot of computing power, and that requires a lot of electricity. The other place electricity goes is the computing power used when the models are run. A single use of a language model takes far more computing power than, for example, a Google search,” explains Ritala.

Peter Sarlin, CEO of Silo AI, shares this view of the enormous energy demand of large language models. However, he points out that a significant share of artificial intelligence solutions are not large language models.

“Most of the artificial intelligence solutions that generate value and are in production are narrow, focused artificial intelligence. Of course, the reality is that they are growing in number and size all the time, but not all artificial intelligence goes into large models.”

For example, the artificial intelligence used in household appliances such as vacuum cleaners or ovens performs a very limited task. The artificial intelligence in an oven can focus solely on adjusting its temperature, which requires relatively little computing power.
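As an illustrative sketch (not from the article), a narrow, rule-based temperature controller of the kind an oven might use can be only a few lines of code, and needs essentially no computing power:

```python
def heater_on(current_temp: float, target_temp: float,
              heating: bool, hysteresis: float = 5.0) -> bool:
    """Bang-bang controller with hysteresis: a fixed rule, no learning,
    representative of 'narrow' appliance intelligence."""
    if current_temp < target_temp - hysteresis:
        return True       # too cold: switch the heater on
    if current_temp > target_temp + hysteresis:
        return False      # too hot: switch the heater off
    return heating        # within the band: keep the current state

# Example: oven heating toward a 180 °C target
print(heater_on(170.0, 180.0, heating=False))  # → True
```

The contrast with a large language model is the point: a rule like this runs in nanoseconds on a microcontroller, while a single language-model query occupies data-center hardware.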

Consumption is on a long growth path

Ritala says that energy efficiency gets constant attention in artificial intelligence research and product development. The computing power required for the same functions, and thus the energy used, is falling steadily.

However, this does not mean that the electricity used by artificial intelligence will decrease in the future. According to Ritala, the use of current artificial intelligence applications is expected to keep growing.

In addition, artificial intelligence is expected to be applied even more widely in the future, for example to video and three-dimensional modeling. These, in turn, demand even more computing power and energy, Ritala points out.

Electricity consumption brings emissions with it

Increased electricity consumption brings a problem with it: the expansion of this electricity-hungry industry should happen in a climate-friendly way.

According to Sarlin, the main challenge is precisely the energy used by large data centers. The computing power that artificial intelligence requires is typically produced in large data centers, which need both energy and cooling.

Cooling data centers, in turn, requires a lot of water. A story by the news agency AP estimates that ChatGPT consumes about half a liter of water when a user asks it a typical series of questions.
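As a back-of-the-envelope sketch of how that per-session figure scales up (the daily session count below is a purely hypothetical assumption, not from the article or AP):

```python
litres_per_session = 0.5        # AP's estimate cited in the article
sessions_per_day = 10_000_000   # hypothetical, for illustration only

litres_per_day = litres_per_session * sessions_per_day
print(f"{litres_per_day:,.0f} litres/day")     # → 5,000,000 litres/day
print(f"{litres_per_day / 1000:,.0f} m³/day")  # the same in cubic metres
```

At that assumed volume, half a liter per session adds up to millions of litres of cooling water per day.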

Making a data center carbon neutral is, in principle, straightforward: the electricity it uses simply has to be carbon neutral.

Many companies in the industry strive for exactly this. Moreover, the availability of carbon-neutral electricity is a key draw for data center investments, the IEA report states. According to the organization, cheap, emission-free electricity has attracted investments especially to the Nordic countries.

Some of the energy used by data centers can also be recovered as heat. For example, about a fifth of Kajaani's district heat is produced from the waste heat of the LUMI supercomputer.

“Although it is exceptional, LUMI is not the only one,” Sarlin adds.

It comes down to power generation

Both Sarlin and Ritala agree that society’s electricity consumption will grow strongly in the coming years and decades.

In addition to AI companies' huge data centers, the devices that use artificial intelligence will also require more electricity in the future. Ritala emphasizes that the only way to cover the increased demand carbon-neutrally is to increase carbon-neutral energy production.

“The problem should be solved from the power generation side,” Ritala sums up.

Fact

Key terms

Generative artificial intelligence refers to artificial intelligence models designed to produce new content in the form of written text, audio, images or video.

Large language models are one example of generative artificial intelligence. They learn connections between words from large bodies of material, such as which words appear in which contexts and which often follow each other. This allows the models to predict which words would make up a suitable answer to a given question.
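The "which words often follow each other" idea can be sketched in miniature (illustrative only, and vastly simpler than a real language model) as a bigram counter that predicts the most likely next word:

```python
from collections import Counter, defaultdict

def train_bigrams(text: str) -> dict:
    """Count, for each word, which words follow it in the training text."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict_next(model: dict, word: str):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

corpus = ("large language models consume electricity "
          "large language models need computing power")
model = train_bigrams(corpus)
print(predict_next(model, "language"))  # → models
```

Real large language models learn far richer statistics with billions of parameters, which is exactly where the computing power and electricity discussed above go.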

Traditional artificial intelligence refers to artificial intelligence systems that perform specific tasks by following predefined rules or algorithms.

By Editor
