AI’s energy use will fall over time, says DeepMind sustainability lead

The intense climate impact from the rise of generative AI models will decline as systems become more energy efficient and tech firms shift from building new models to updating existing ones, according to Google DeepMind’s sustainability lead.

Speaking to UKTN, Drew Purves acknowledged that the massive demand for powerful new AI models, including Google’s Gemini, has led to skyrocketing energy and compute costs.

According to one study, generating an image using an AI model uses as much energy as fully charging your smartphone.

When people submit such requests in the millions, this quickly adds up. The climate impact of this increased energy usage, as well as the resources required to sustain the massive data centres that AI models rely on, has consequently caused major concern among environmentalists.

According to Purves, these concerns are justified, but they are unlikely to remain a problem in the future.

Purves said that while companies like Google and OpenAI have been building these powerful AI models in recent years, the compute requirements will fall once the foundations are firmly in place.

“At the same time as the increasing demand” for both compute per model and “societal demand in AI”, Purves said, “there are a whole bunch of methodologies that are learning how to do more with less in AI as well”.

The early stages of building foundational AI models carry these significant energy and compute costs. However, according to Purves, those costs will fall as the companies building the models move on from initial construction to what is called “inference mode”.

AI training is energy-intensive

Inference mode refers to the phase in which foundational AI models are deployed and actively producing outputs, as opposed to the training phase, in which large volumes of data are fed into the model.

“Running the foundational models in inference mode is a lot cheaper than training them,” Purves explained. “The compute is really about training, that’s the massive bottleneck.”
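
To make the training-versus-inference distinction concrete, here is a minimal, illustrative sketch in PyTorch-style Python. It is not DeepMind’s or Google’s code; the model, data and hyperparameters are placeholders. The point it shows is that training repeats forward and backward passes over enormous datasets, while inference is a single forward pass per request with gradient tracking switched off.

```python
import torch
import torch.nn as nn

# Toy stand-in for a foundation model; real models have billions of parameters.
model = nn.Sequential(nn.Linear(512, 2048), nn.ReLU(), nn.Linear(2048, 512))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Training: forward + backward + optimizer step, repeated over huge datasets.
# This loop, run at scale, is where the bulk of the compute and energy goes.
def training_step(batch_inputs, batch_targets):
    model.train()
    optimizer.zero_grad()
    outputs = model(batch_inputs)
    loss = loss_fn(outputs, batch_targets)
    loss.backward()          # gradient computation roughly doubles the work
    optimizer.step()
    return loss.item()

# Inference: one forward pass per request, with no gradients stored.
def generate(inputs):
    model.eval()
    with torch.no_grad():    # skips gradient bookkeeping entirely
        return model(inputs)
```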

While it’s difficult to get a precise figure for the amount of energy required to train a large language model like GPT-3, one estimate puts it at just under 1,300 megawatt hours (MWh) of electricity. That’s roughly the annual electricity consumption of 480 UK homes.
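
As a rough sanity check on that comparison, the figures line up if a typical UK household uses about 2,700 kWh of electricity a year (approximately Ofgem’s typical domestic consumption value, a figure assumed here rather than given in the article):

```python
# Rough sanity check on the "480 UK homes" comparison.
# The 2,700 kWh/year household figure is an assumption, not from the article.

training_energy_mwh = 1_300        # estimated energy to train a GPT-3-scale model
household_kwh_per_year = 2_700     # assumed annual electricity use of one UK home

training_energy_kwh = training_energy_mwh * 1_000
equivalent_homes = training_energy_kwh / household_kwh_per_year
print(f"{equivalent_homes:.0f} households")  # ~481, close to the article's 480
```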

High demand for running models in inference mode will still increase compute and energy usage, and additional training will still be required to “fine-tune” the foundational models. But Purves said this phase can “typically be done on a very small amount of data and a small amount of training in comparison to the original model”.

The DeepMind sustainability lead added that once foundational AI models have been created, energy costs can fall further through a process called “distillation”: transferring knowledge from a large model to a smaller one.

“We’re finding that if you begin with the foundational model, and then you bring in some specialist data… you can then train a very small model that can be incredibly useful and accurate.”
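
As an illustration of the distillation idea Purves describes, the sketch below shows one training step in which a small “student” model learns to match the output distribution of a large “teacher” model, so the cheaper student can then serve requests. It is a generic knowledge-distillation example in PyTorch-style Python, not Google’s pipeline; the models, optimizer and temperature are placeholders.

```python
import torch
import torch.nn.functional as F

def distillation_step(teacher, student, optimizer, inputs, temperature=2.0):
    """One knowledge-distillation update: the student mimics the teacher's
    softened output distribution. Models and data are placeholders."""
    teacher.eval()
    with torch.no_grad():                # the big model only runs forward
        teacher_logits = teacher(inputs)

    student.train()
    student_logits = student(inputs)

    # KL divergence between softened teacher and student distributions.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    optimizer.zero_grad()
    loss.backward()                      # only the small student is updated
    optimizer.step()
    return loss.item()
```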

Purves also noted that while compute usage has increased immensely, the cost of compute, “not just per dollar spent but per energy spent”, is coming down very quickly.

The upfront costs mean only firms with vast resources are able to build truly large AI models. Microsoft, Google, Amazon and Meta are among the top players vying for AI dominance.

Purves said that while these companies compete over the capabilities of their models, increased environmental pressure will inspire them to also compete on “resource efficiency”.

Speaking at London-based sustainability event Earthfest, Purves added that the potential for AI to be used in the fight for climate sustainability was worth the costs.

Examples of this from the UK include Treefera, a startup using AI to map trees for carbon credit verification, and Sorted, which uses AI and lasers to identify recyclable items at waste facilities.
