“Plug, baby, plug! Electricity is available. You can plug. It’s ready!” ~ Emmanuel Macron
In the always beautiful city of Paris, world leaders gathered this week to talk about the future of AI at the Artificial Intelligence Action Summit. At this summit, France unveiled its plan to build 35 data centers spread throughout the country. There are already almost 250 data centers in France, which still looks like a small number compared to the roughly 2,000 data centers in the US. But with a 30-50 billion dollar investment from Emirati allies, France is promoting itself as the country taking the lead in Europe's AI race, with concrete action.
During his (in)famous inauguration speech, Trump also mentioned his administration's plan to invest hundreds of billions of dollars in AI infrastructure, and in the meantime the EU has followed suit by announcing a 200 billion euro investment of its own.
This made me wonder: what is the impact of AI on the environment, and how much energy does it actually consume? So I set out on a mission to answer a few questions:
How much impact do data centers have on the environment through their energy consumption?
How much of the data center energy demand is - and will be - driven by AI?
Would limiting our usage of AI tools make a material difference for the planet?
Can AI have a positive impact on the climate and environment?
What is the environmental impact of data centers around the world?
Besides energy consumption, we should also account for land use, deforestation, rare earth resources, water for cooling, chip manufacturing processes and the logistics required to build data centers. Unfortunately, I'm just a mere mortal with only a few hours of spare time in the weekend to write about AI, so I will focus on the energy usage of existing and newly announced data centers.
Here are the numbers I found:
There are 11,000 registered data centers around the world.
Data centers account for around 1% of global electricity usage.
Data centers account for around 2-4% of electricity usage in the EU, USA and China.
Data centers account for up to 20% of electricity usage locally (e.g. in Ireland), because they are clustered in localized areas.
40% of data center power usage is for compute and 40% is used for cooling.
To put this in perspective:
For comparison, electric vehicles used around 0.5% of global electricity at the same time.
Annual electricity consumption from data centers globally is about half the electricity consumption of household IT appliances, like computers, phones and TVs.
Part of these numbers come from an IEA (International Energy Agency) report, which uses numbers from 2023, the year in which OpenAI launched GPT-4. Since that year, AI usage and energy demand have risen significantly. This makes it necessary to predict the future to truly understand the energy usage of data centers and AI.
What makes this extra difficult is that almost every IEA report or article mentions that we need "better stocktaking" of data center electricity usage. In other words: we don't know their energy usage precisely! Hopefully that will change once companies adhere to the new EU AI Act, which demands that high-risk AI systems keep track of their energy consumption.
Predictions for future energy consumption vary wildly:
IEA: a range of 620-1050 TWh in 2026, with their "base case" for energy demand at just over 800 TWh.
Goldman Sachs: data centers will consume up to 4% of total world electricity by 2030. Their model also predicts data center energy usage of 1063 TWh by 2030, of which 209 TWh consumed by AI globally.
International Data Corporation (IDC): data centers will use 857 TWh by 2028.
McKinsey: data centers in the United States will reach 606 TWh by 2030.
Gartner: data centers will require 550 TWh per year by 2027.
Deloitte: data centers will use 536 TWh in 2025; global data center electricity consumption could roughly double to 1,065 TWh by 2030.
How much of the data center energy demand is - and will be - driven by AI?
Now that we have some rough numbers for data center electricity consumption, it’s time to aggregate them and see how much of that is used by AI on a yearly basis. Since we are a few months into 2025 at the time of writing, let’s take 620 TWh as the power consumption of all data centers combined. This is at the lower end of IEA’s prediction for next year (2026) and higher than Deloitte’s prediction for this year.
To estimate the percentage of data center capacity that is spent on AI training and inference, we can take the average of the following predictions:
45% by McKinsey
~ 14% by Goldman Sachs
14.2% by the International Data Corp in Sept. 2024
10-20% by Electric Power Research Institute
30-40% by the IEA, but this includes cryptocurrency power usage
The huge outlier here is McKinsey's number, and I wonder whether they also counted cryptocurrencies in their prediction. Since these numbers were published several months ago or more, and considering the significant investments into AI infrastructure announced very recently, we can keep these higher estimates in our average to get a more accurate percentage of data center power consumption spent on AI.
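That averaging is simple enough to check ourselves. A minimal sketch, taking the midpoint of each ranged estimate and assuming the 620 TWh total from above:

```python
# Back-of-envelope: what share of data center electricity goes to AI?
# Estimates are the ones quoted above; midpoints are used for ranges.
estimates_pct = {
    "McKinsey": 45.0,
    "Goldman Sachs": 14.0,
    "IDC": 14.2,
    "EPRI (10-20%)": (10 + 20) / 2,
    "IEA (30-40%, incl. crypto)": (30 + 40) / 2,
}

avg_share = sum(estimates_pct.values()) / len(estimates_pct)  # in percent
total_twh = 620  # assumed global data center consumption this year
ai_twh = total_twh * avg_share / 100

print(f"average AI share: {avg_share:.2f}%")  # 24.64%
print(f"AI consumption:   {ai_twh:.0f} TWh")  # ~153 TWh
```

Swapping in different totals or dropping the McKinsey outlier changes the result considerably, which is exactly why these estimates should be read as rough orders of magnitude.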
After some quick maths, the average I got is 24.64%. This comes down to ~153 TWh of energy consumption for all AI-related tasks globally this year. This number doesn’t tell us much without being able to compare it to power generation and consumption from other sources:
The USA produced about 4029 TWh of electricity in 2023.
France produces 320 TWh in nuclear energy per year.
254 TWh per year is used by US homes for air conditioning.
Cruise ships are estimated to use 85-250 TWh per year.
A single nuclear plant's annual energy production is roughly 2.6 TWh (small), 8.8 TWh (medium), 14 TWh (large) or 38 TWh (huge).
US industry directly used 1,020 TWh of electricity per year, not counting its fossil fuel usage, which amounts to another 6,609 TWh per year.
The US transportation sector uses 8,188 TWh of energy from fossil fuels per year.
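To make the nuclear comparison concrete, here is a quick sketch dividing the ~153 TWh estimate by the plant outputs listed above:

```python
# How many nuclear plants would it take to cover global AI electricity use?
ai_twh = 153  # estimated annual AI consumption from the averaging above

plant_twh = {"small": 2.6, "medium": 8.8, "large": 14, "huge": 38}  # TWh/year

for size, output in plant_twh.items():
    print(f"{size}: {ai_twh / output:.1f} plants")
```

In other words, somewhere between four huge plants and about sixty small ones would be needed to run all AI workloads worldwide.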
Putting it in perspective: data center energy usage is definitely significant, but the challenge isn't just the energy consumption of the AI models themselves. Data centers also put a lot of pressure on local energy grids and require a lot of (fresh) water for cooling. It goes beyond the scope of this opinion piece, but I think these issues can be solved in the short term.
Should you limit your AI usage to save the planet?
This is a personal choice for everyone, but in any case it's nice to base it on real numbers. Real numbers, however, are annoyingly difficult to obtain. You can't just divide total AI power consumption by the 1 billion ChatGPT queries per day: ChatGPT isn't the only LLM on the market, and LLMs are not the only energy-consuming models. Image and video generators are likely to be much more resource-intensive per query.
Looking at available data, researchers at the end of 2023 published an estimate of 2.9 Wh per ChatGPT query, which does not include the training of the model. For ease of use, let's double this number for 2025, because inference (querying LLMs) is probably much more costly than it was a year and a half ago.
If one ChatGPT query currently uses 5.8 Wh, and an average user makes 2920 queries per year, then the following energy consumption sources would be equivalent:
2920 ChatGPT queries, the average number of queries per user, per year
~ 8.5 hours, or one working day, of running your A/C (2 kW)
~ 8.5 hours of using your oven (2 kW)
~ 21 hours of gaming on a PC (0.8 kW)
~ 11 hours of using an electric heater (1.5 kW)
~ 1.5 hours of charging your electric car (11 kW)
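These equivalents follow from dividing a user's yearly query energy by each appliance's power draw. A sketch, taking the figures in parentheses as power ratings in kW:

```python
# Yearly ChatGPT energy for an average user, expressed in appliance-hours.
wh_per_query = 5.8      # doubled 2023 estimate, as assumed above
queries_per_year = 2920
yearly_kwh = wh_per_query * queries_per_year / 1000  # ~16.9 kWh

appliances_kw = {       # assumed power draw per appliance, in kW
    "A/C": 2.0,
    "oven": 2.0,
    "gaming PC": 0.8,
    "electric heater": 1.5,
    "EV charger": 11.0,
}

for name, kw in appliances_kw.items():
    print(f"{name}: {yearly_kwh / kw:.1f} hours")
```

A year of average ChatGPT usage thus lands around 17 kWh, i.e. somewhere between one and two euros of household electricity at typical European prices.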
Google Search queries used 0.3 Wh per request in 2023, but if you've Googled anything recently, you'll have noticed that a lot of the search results are AI-enhanced nowadays. Concretely, this means we can no longer tell people to "just Google it, don't ask ChatGPT". The same research article by Alex de Vries, which is so often quoted to argue that ChatGPT uses 10x more energy per query than a Google search, also predicts how much an AI-enhanced Google query will consume: 7 to 9 Wh per query.
We can’t tell people to use Google instead of ChatGPT any longer.
Furthermore, saving computer time and getting correct information quickly can also have a positive impact on your energy usage. Does it make sense to limit your LLM usage based on these numbers? You decide!
Positive impact
When evaluating the environmental impact of AI, we should keep in mind that AI can help us shape and prepare for the future too. Not all power consumption is a net loss for the environment. AI already plays a huge role in climate change prevention and adaptation.
Thanks to AI we can predict weather and climate, map plastic concentrations in our oceans, reforest effectively with drones, recycle waste more efficiently, and provide life-saving information during heatwaves, hurricanes and floods. Finally, when properly trained, AI can massively help shape public opinion, helping us all prevent further climate change and adapt to the inevitable changes.
Personally, I don't see a future in which AI does not play a huge role in our daily lives. There is no real alternative to using LLMs for getting information, especially now that Google also uses AI in its search results. For me, the potential of AI to have a positive impact outweighs the downside that comes with its energy consumption. Ultimately it is up to you to decide how you will use artificial intelligence in your daily tasks, but hopefully you now know a little bit more about how much power your queries actually consume!
Sources
EU Commission: Artificial Intelligence – Questions and Answers
IEA: What the data centre and AI boom could mean for the energy sector
IEA: Global growth in final electricity demand by use in the Stated Policies Scenario, 2023-2030
Goldman Sachs: AI is poised to drive 160% increase in data center power demand
McKinsey: AI power: Expanding data center capacity to meet growing demand
Gartner: Gartner Predicts Power Shortages Will Restrict 40% of AI Data Centers By 2027
Yale: As Use of A.I. Soars, So Does the Energy and Water It Requires
Sifted: UAE to invest billions building new AI data centre in France
LesEchos: Data centers, services publics : le nouveau plan du gouvernement pour doper l'IA en France
DataCenterDynamics: France and UAE to invest billions into 1GW European AI data center
DataCenterDynamics: Equinix launches data center in Paris, France
Ministères Écologie Territoires: Forum dédié à l’IA durable, à l’Hôtel de Roquelaure
DW News: 'Plug, baby plug': Macron envisions France's future AI as electricity-powered | DW News
Time.com: How AI Is Fueling a Boom in Data Centers and Energy Demand
US Energy Information Administration: Use of energy explained
US Bureau of Transportation Statistics: U.S. Consumption of Energy from Primary Sources by Sector
EPRI: Powering Intelligence, Analyzing Artificial Intelligence and Data Center Energy Consumption
Alex de Vries: The growing energy footprint of artificial intelligence
Greyparrot: 2022 wrapped: Changing the world's relationship with waste
World Economic Forum: 9 ways AI is helping tackle climate change