The accelerating development and adoption of Artificial Intelligence could fundamentally change how we work and, as a consequence, how we live. Yet, while we talk extensively about the impact AI could have on many areas of the economy, we tend to overlook its impact on the environment.
In September 2024, Microsoft and U.S. energy company Constellation announced plans to restart energy production at Pennsylvania’s Three Mile Island nuclear power plant. The facility, which had ceased operations in 2019, would power Microsoft’s data centers under a roughly 20-year fixed-price agreement. The economic and clean-energy benefits for Microsoft are evidently significant enough to outweigh the potential PR downside of associating the brand with the worst nuclear accident in U.S. history.
Microsoft’s Deal
The deal comes amid growing electricity demand spurred by the Artificial Intelligence boom, and it highlights the conundrum that Microsoft and its competitors have faced since AI became the new hot thing in tech. All major global corporations have been working to lower their environmental footprint for at least a decade, with tech giants often spearheading the effort.
On the other hand, the fear of being left behind in AI by competing companies has suddenly forced big tech’s hand, proving a stronger driving force than newly unfashionable environmental concerns.
Official numbers paint a clear and grim picture. Microsoft, a major investor in OpenAI, has placed a massive multi-billion-dollar bet on integrating generative AI into its products; earlier this year, it announced that its CO2 emissions have risen about 30% since 2020. Google, which uses its data centers to train and run its own Gemini family of generative models, is in a similar position—its 2023 emissions were about 50% higher than in 2019.
The Energy Cost of Training AI Models
This might be just the beginning. According to the latest International Energy Agency (IEA) report on global energy consumption, data centers, cryptocurrency, and artificial intelligence collectively consumed almost 2% of global electricity in 2022. That demand is expected to double by 2026—reaching roughly the equivalent of Japan’s annual electricity consumption.
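A quick back-of-the-envelope check makes the comparison concrete. The absolute figures below are assumptions for illustration, not from the text: the IEA has estimated the combined 2022 consumption at roughly 460 TWh, and Japan’s annual electricity use at roughly 940 TWh.

```python
# Rough sanity check of the IEA projection cited above.
# Assumed figures (approximate, for illustration only):
consumption_2022_twh = 460   # data centers + crypto + AI, 2022 (IEA estimate)
japan_annual_twh = 940       # Japan's yearly electricity consumption

# "Expected to double by 2026"
projected_2026_twh = consumption_2022_twh * 2

print(f"Projected 2026 demand: {projected_2026_twh} TWh")
print(f"Japan's annual usage:  {japan_annual_twh} TWh")
# The doubled figure lands within a few percent of Japan's yearly consumption.
```

Under these assumed inputs, the doubled demand (920 TWh) is within about 2% of Japan’s annual usage, which is consistent with the comparison the IEA draws.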
The scale of consumption is explained by AI’s specific energy needs. Training a large language model like GPT-3 is estimated to have required about 1,300 megawatt-hours of electricity, and the figure is likely even higher for later models such as GPT-4 and GPT-4o.
These energy needs don’t stop at the training phase, though; they extend to inference, the process that runs every time a user queries the model. While a standard Google search uses 0.3 watt-hours on average, a single ChatGPT query is estimated to consume a staggering 2.9 watt-hours. The IEA calculates that if all daily global web search queries (about 9 billion interactions) were replaced with LLM-based searches, the models would require 10 terawatt-hours annually. That’s about the electricity consumption of 1.5 million EU citizens.
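The annual figure follows directly from the per-query numbers in the text. A minimal sketch of the arithmetic, using only the values quoted above:

```python
# Back-of-the-envelope reconstruction of the IEA estimate cited above.
searches_per_day = 9e9    # daily global web search queries (from the text)
wh_per_llm_query = 2.9    # estimated Wh per ChatGPT-style query
wh_per_search = 0.3       # average Wh per standard Google search

# Total if every search ran through an LLM, converted Wh -> TWh (1 TWh = 1e12 Wh)
annual_llm_twh = searches_per_day * wh_per_llm_query * 365 / 1e12

# Increment over today's standard searches
annual_extra_twh = searches_per_day * (wh_per_llm_query - wh_per_search) * 365 / 1e12

print(f"All searches via LLM:       {annual_llm_twh:.1f} TWh/year")
print(f"Increase over search today: {annual_extra_twh:.1f} TWh/year")
```

This lands at roughly 9.5 TWh per year, consistent with the IEA’s rounded figure of 10 TWh; about 8.5 TWh of that would be new demand on top of what standard search already consumes.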
As Sasha Luccioni, Artificial Intelligence Researcher & Climate Lead at AI company Hugging Face, explained in a recent interview with Vox, the energy needs of AI seem untethered from any environmental concern:
“Every generation of GPUs — the specialized chips for training AI models — tends to consume more energy than the previous generation. They’re more powerful, but they’re also more energy-intensive. And people are using more and more of them because they want to train bigger and bigger AI models. It’s kind of this vicious circle. When you deploy AI models, you have to have them always on. ChatGPT is never off.”
Nuclear Energy: A Potential Solution for AI’s Power Needs
Microsoft’s deal with Constellation is easy to understand through this lens. Nuclear power could serve as a clean, always-available complement to renewables, guaranteeing that companies can continue investing in energy-intensive AI development while keeping their emissions in check.
Despite the usual caveats about nuclear risk and waste disposal, nuclear is the only viable, clean option to satisfy the tech sector’s growing energy needs. After all, the alternatives would be even worse for the global climate.
In Virginia’s so-called Data Center Alley, for example, several energy companies have already delayed phase-out plans for their coal plants to keep up the energy supply in one of the most data-center-dense regions in the U.S. Moreover, according to a Barclays report, access to gas pipelines for energy generation remains a major consideration when companies select a location for data-center development.
While finding the right energy mix to balance AI development and environmental goals will remain a primary challenge for tech companies, artificial intelligence could help offset the added emissions burden it creates by rebalancing consumption in other areas.
AI systems, for example, could significantly reduce energy consumption in manufacturing by optimizing processes through real-time monitoring and adaptive control—all while running efficient smaller models on standard computers.
Moreover, AI could play a fundamental role in managing the grid, helping to balance and maximize renewable sources by smoothing over their intermittent availability. Whether these clear benefits and optimizations will outweigh the energy requirements of the ever-larger, ever-more-advanced foundation models the tech industry is heavily investing in, however, remains hard to predict.