August 22, 2024
The proliferation of artificial intelligence has raised questions about how much energy it uses. A single query in a generative AI model may use as much energy as turning on a lightbulb for one hour. Major tech companies now say they may struggle to meet their climate goals because AI usage has led to increased energy demand.
Can AI be more sustainable?
A growing chorus of experts hopes the answer is “Yes.”
Data centers already account for 1-1.5% of global electricity use, according to the International Energy Agency, and the boom in artificial intelligence, which has triggered a surge in spending by hyperscalers, could drive that number up even further over the next few years. The organization predicts that total electricity consumption by data centers around the world could more than double by 2026, with renewables alone unable to meet this demand.
“Large-scale AI is just getting started,” said IEEE Senior Member Euclides Chuma. “Researchers are exploring hardware and software solutions to lower AI’s energy consumption while making it more powerful.”
Why Does AI Use So Much Energy?
Answering what seems like a simple question isn’t straightforward. The term artificial intelligence means many different things, and there are many kinds of AI.
On the one hand, there are AI systems that have been in use for years — the kind that analyzes your shopping behavior on e-commerce sites and recommends products, for example. And then there are the newer generative AI systems, which can create written and visual content with just a few keystrokes.
The shopping algorithm usually uses little energy when it predicts what kind of t-shirt you might want to buy. Generative AI systems use more, and there are a lot of people using them. According to one survey of generative AI usage in six highly developed countries, 27% of respondents reported using generative AI in their private life, and 21% reported using it at work or at school. And while most of these people don’t use generative AI every day, a small but significant proportion of users do.
AI systems don’t just consume energy during operation. Training them is also energy-intensive, because it demands enormous computational power, and that computation runs on electricity.
“Current AI systems consume a lot of energy due to the large amount of data used in their training,” said IEEE Member Edson Prestes. “These systems need to learn from a huge set of data coming from different sources and of different types. Due to the massive amount of data, AI training is not performed using a simple computer. It generally involves a cluster of computers working together.”
An article in IEEE Spectrum estimates just how much. According to the article, “researchers have estimated that training the state-of-the-art language generation model GPT-3 took weeks and cost millions of dollars. It also required 190,000 kilowatt-hours of electricity, producing the same amount of CO2 as driving a car a distance roughly equivalent to a trip to the moon and back!”
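The kilowatt-hour figure above can be translated into rougher, more familiar equivalents. The conversion factors below (grid carbon intensity, average household consumption) are illustrative assumptions for a back-of-envelope sketch, not figures from the article:

```python
# Back-of-envelope sketch: converting the article's 190,000 kWh training
# figure into rough equivalents. The emission factor and household figure
# are assumed averages, not numbers from the article.
TRAINING_KWH = 190_000
GRID_KG_CO2_PER_KWH = 0.4           # assumed average grid carbon intensity
US_HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumed average annual household use

co2_tonnes = TRAINING_KWH * GRID_KG_CO2_PER_KWH / 1000
households = TRAINING_KWH / US_HOUSEHOLD_KWH_PER_YEAR
print(f"~{co2_tonnes:.0f} t CO2, roughly {households:.0f} households' "
      f"electricity for a year")
```

Under these assumed factors, a single training run works out to tens of tonnes of CO2 and the annual electricity use of well over a dozen households.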
Putting AI on an Energy Diet
Computing was primarily performed on mainframe computers in the 1960s. They were large and, by today’s standards, not very powerful. A typical smartphone has more computing power, consumes less energy, and fits inside your pocket.
Many experts believe that AI will follow a similar path, becoming more powerful and efficient as improvements are made.
Changes in the types of hardware used to train models could reduce energy consumption. One approach involves the type of memory the training hardware relies on.
“Using non-volatile memory rather than volatile memory (e.g., DRAM) could reduce energy consumption since the data in such memories would not need to be refreshed,” said 2024 IEEE President Tom Coughlin.
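The refresh overhead Coughlin describes can be sketched with a toy model: DRAM must periodically refresh every cell to keep data from fading, so it draws power even when idle, while non-volatile memory retains data with no refresh. All figures below are illustrative assumptions, not measurements of any real system:

```python
# Toy model of DRAM standby cost: volatile memory draws refresh power
# continuously, even when no computation is happening. The power-per-GB
# figure is an illustrative assumption.
DRAM_REFRESH_W_PER_GB = 0.05   # assumed idle refresh power per gigabyte
CAPACITY_GB = 1024             # 1 TB of training data held in memory
HOURS = 24 * 7                 # one week of training

refresh_kwh = DRAM_REFRESH_W_PER_GB * CAPACITY_GB * HOURS / 1000
print(f"~{refresh_kwh:.1f} kWh spent on refresh alone over one week")
```

Even this small assumed per-gigabyte figure adds up over a multi-week training run, which is why memory that needs no refresh is attractive.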
Other steps might mark the path to a greener artificial intelligence:
Improve data center cooling: Computers generate heat, and it takes energy to cool the servers on which AI runs. Improving the cooling systems of data centers would reduce the energy consumption attributed to AI.
Use less data and smaller models: AI models do not always need to be massive. A smaller model trained on less data might result in slightly lower accuracy but offer far better energy efficiency.
Use renewable energy: Although many data centers already use renewable energy, not all do, and renewable energy isn’t always available. In some locations, increased energy demand has been met with traditional sources with larger greenhouse gas emissions.
Don’t use AI: As useful as artificial intelligence is, sometimes it isn’t better than traditional methods. In some instances, we might be better off not using AI.
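The smaller-model point above can be made concrete with a widely used rule of thumb from the scaling-law literature: training compute scales roughly as 6 × parameters × training tokens. The model sizes below are illustrative, and the formula is an approximation, not an exact energy model:

```python
# Rough sketch of why smaller models train more cheaply. Training compute
# is commonly approximated as 6 * parameters * training tokens, so
# shrinking the model shrinks the energy bill proportionally, all else
# being equal.
def training_flops(params: float, tokens: float) -> float:
    return 6 * params * tokens

big   = training_flops(175e9, 300e9)  # a GPT-3-scale model: 175B params
small = training_flops(1.3e9, 300e9)  # a much smaller model, same data

print(f"~{big / small:.0f}x more compute for the larger model")
```

Under this approximation, the large model costs over a hundred times more compute, and thus roughly that much more energy, than the small one on the same dataset.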
Learn More: If you want to learn more about the building blocks of sustainable AI, check out this panel moderated by 2024 IEEE President Tom Coughlin.