
This article was published on October 11, 2023

Google’s AI could soon consume as much electricity as Ireland, study finds

The servers on which AI models run need a sh*t tonne of juice



Amid the debate over the dangers of widespread AI development, an important concern may have been overlooked: the huge amount of energy required to train and run these large language models.

A new study published this week suggests that the AI industry could consume as much energy as a country like Argentina, the Netherlands, or Sweden by 2027. What’s more, the research estimates that if Google alone switched its whole search business to AI, it would end up using 29.3 terawatt-hours (TWh) per year, equivalent to the electricity consumption of Ireland.

The paper was published by Alex de Vries at the VU Amsterdam School of Business and Economics. 

In 2021, Google’s total electricity consumption was 18.3 TWh, with AI accounting for 10–15% of it. However, the tech giant is rapidly scaling the AI parts of its business, most notably with the launch of its Bard chatbot and the integration of AI into its search engine.
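To put those numbers side by side, here’s a rough back-of-envelope sketch using only the figures cited above; it’s purely illustrative and not part of the study’s methodology.

```python
# Back-of-envelope comparison using the figures cited in the article.
# Purely illustrative; not the study's methodology.

google_total_2021_twh = 18.3              # Google's total 2021 electricity use (TWh)
ai_share_low, ai_share_high = 0.10, 0.15  # AI's estimated share of that total

ai_2021_low = google_total_2021_twh * ai_share_low    # ~1.8 TWh
ai_2021_high = google_total_2021_twh * ai_share_high  # ~2.7 TWh

full_ai_search_twh = 29.3  # study's estimate if every Google search used AI

print(f"AI's 2021 footprint: {ai_2021_low:.1f}-{ai_2021_high:.1f} TWh")
print(f"Hypothetical AI-powered search: {full_ai_search_twh} TWh "
      f"(~{full_ai_search_twh / ai_2021_high:.0f}x the upper 2021 estimate)")
```

In other words, on the article’s own figures, a fully AI-driven search business would use roughly ten times more electricity than Google’s entire AI operation did in 2021.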


However, the scenario outlined in the study assumes full-scale AI adoption utilising current hardware and software, which is unlikely to happen rapidly, said de Vries. One of the main hurdles to such widespread adoption is the limited supply of graphics processing units (GPUs) powerful enough to process all that data.

While entirely hypothetical, the study casts light on an often unstated impact of scaling up AI technologies. Data centres already use between 1% and 1.3% of all the world’s electricity, and adding AI to existing applications like search engines could rapidly increase that share.
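For context, the same kind of rough arithmetic can relate the study’s search scenario to that share. The ~25,000 TWh figure for annual global electricity consumption used below is an assumed round number added for illustration, not one taken from the article or the study.

```python
# Illustrative only: the global-consumption figure is an assumed round number,
# not taken from the article or the study.

world_electricity_twh = 25_000              # assumed annual global consumption (TWh)
dc_share_low, dc_share_high = 0.010, 0.013  # data centres' share cited above

dc_use_low = world_electricity_twh * dc_share_low    # ~250 TWh
dc_use_high = world_electricity_twh * dc_share_high  # ~325 TWh

ai_search_twh = 29.3  # the study's hypothetical fully AI-powered Google Search

print(f"Data centres today: roughly {dc_use_low:.0f}-{dc_use_high:.0f} TWh per year")
print(f"AI-powered search alone would add roughly "
      f"{ai_search_twh / dc_use_high:.0%}-{ai_search_twh / dc_use_low:.0%} on top")
```

Under those assumptions, AI-powered search alone would add roughly a tenth to today’s global data-centre electricity use.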

“It would be advisable for developers not only to focus on optimising AI, but also to critically consider the necessity of using AI in the first place, as it is unlikely that all applications will benefit from AI or that the benefits will always outweigh the costs,” advised de Vries.

