Powering AI Could Use as Much Electricity as a Small Country

Artificial intelligence comes with many promises: it is said to help coders code faster, help drivers drive more safely and make daily tasks less time-consuming.

In an article published October 10 in the journal Joule, a researcher shows that if the technology is adopted widely, it will have a large energy footprint, one that may exceed the power demands of some countries in the future.

“Looking at the growing demand for AI services, it’s very likely that energy consumption related to AI will significantly increase in the coming years,” says Alex de Vries, the paper’s author at Vrije Universiteit Amsterdam.

Since 2022, generative AI, which can produce text, images or other data, has grown rapidly, with OpenAI’s ChatGPT as a prominent example. Training these AI tools requires feeding the models large amounts of data, an energy-intensive process.

Hugging Face, an AI-developing company based in New York, reported that its multilingual text-generating AI tool consumed about 433 megawatt-hours (MWh) during training, enough to power 40 average American homes for a year.
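The homes comparison can be sanity-checked with quick arithmetic. The household figure below is an assumption not given in the article (roughly in line with published U.S. averages of about 10,800 kWh per home per year):

```python
# Sanity check: 433 MWh of training energy vs. average U.S. homes.
training_mwh = 433                       # training consumption from the article
training_kwh = training_mwh * 1_000      # convert MWh to kWh
home_kwh_per_year = 10_800               # assumed average annual U.S. household use
homes_powered = training_kwh / home_kwh_per_year
print(round(homes_powered))              # ≈ 40 homes for a year
```

With that assumed household figure, the arithmetic lands almost exactly on the article’s “40 homes” claim.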

And AI’s energy footprint does not end with training. De Vries’s analysis shows that putting the tool to work also takes a toll: every time it generates text or an image in response to a prompt, it uses a significant amount of computing power, and thus energy. Running ChatGPT, for example, could cost 564 MWh of electricity a day.
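To put the daily figure in perspective, it can be annualized and compared with household use. The per-home figure is an assumption for illustration, not from the article:

```python
# Annualize ChatGPT's reported daily electricity cost from the article.
daily_mwh = 564
annual_mwh = daily_mwh * 365             # 205,860 MWh, i.e. about 206 GWh per year

# Assumed average U.S. household use (~10,800 kWh/year, not stated in the article).
home_kwh_per_year = 10_800
homes_equivalent = annual_mwh * 1_000 / home_kwh_per_year
print(annual_mwh, round(homes_equivalent))
```

Under these assumptions, a year of operation at that rate would consume as much electricity as roughly 19,000 average American homes.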

While companies around the world are working to improve the efficiency of AI hardware and software and make the tools less energy-intensive, de Vries notes that gains in machine efficiency often increase demand. In the end, technological advances can lead to a net increase in resource use, a phenomenon known as the Jevons paradox.

“The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it,” de Vries says.

Google, for example, has been incorporating generative AI into the company’s email service and is testing out powering its search engine with AI. The company currently processes up to 9 billion searches a day. Based on these figures, de Vries estimates that if every Google search used AI, it would need about 29.2 TWh of power a year, equivalent to the annual electricity consumption of Ireland.
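The Google estimate implies a per-search energy cost, which can be backed out of the two numbers the article gives (a derived figure, not one stated in the article):

```python
# Implied energy per AI-assisted Google search, from the article's figures.
searches_per_day = 9e9                   # up to 9 billion searches a day
annual_twh = 29.2                        # estimated annual consumption if all searches use AI

annual_wh = annual_twh * 1e12            # convert TWh to Wh
wh_per_search = annual_wh / (searches_per_day * 365)
print(round(wh_per_search, 1))           # ≈ 8.9 Wh per search
```

That works out to roughly 9 watt-hours per search, many times the energy cost usually attributed to a conventional search query.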

This extreme scenario is unlikely to happen in the short term because of the high costs associated with additional AI servers and bottlenecks in the AI server supply chain, de Vries says. But the production of AI servers is projected to grow rapidly in the near future.

By 2027, newly manufactured AI servers could add 85.4 to 134.0 TWh to worldwide annual electricity consumption. This figure is comparable to the annual electricity consumption of countries such as the Netherlands, Argentina and Sweden.

Moreover, improvements in AI efficiency could also enable developers to repurpose some of the older computer processing chips for AI, which could further increase AI-related electricity consumption.

“The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy-intensive, so we don’t want to put it in all kinds of things where we don’t actually need it,” de Vries concludes.

News Desk

UNLOCK Blockchain News Desk is fueled by a passionate team of young individuals deeply immersed in the world of Blockchain and Crypto. Our mission? To keep you, our loyal reader, on the cutting edge of industry news. Drop us a line at info(@)unlock-bc.com to connect with our team and stay ahead of the curve!
