AI and Energy Consumption
The question isn’t whether AI consumes energy (it obviously does) but whether, taken as a whole and over time, AI is helping or hurting climate goals. Instead of thinking of the tech as inherently “good” or “bad” for the environment, consider that AI can reduce emissions in some contexts while increasing them in others, sometimes at the same time.
The Good, the Bad and…
AI systems already improve efficiency across industries. Machine learning is used to optimize power grids, balance energy demand, and enable precision agriculture that reduces fertilizer and pesticide use.
At the same time, the computing cost behind AI is substantial. Training large models can require energy comparable to hundreds of households over a year. Related research also highlights the Matthew Effect, the idea that “the rich get richer.” When tasks are simple, many participants can compete effectively. But as difficulty increases, small advantages in hardware, capital, or efficiency compound rapidly. Over time, this can lead to dominance by a very small number of actors.
The type of AI usage also matters. Text-based interactions are relatively lightweight. Video generation and real-time streaming, by contrast, require far more computation due to frame-by-frame processing. This matters because video already dominates global internet traffic, and AI-enhanced video compounds that demand as adoption grows.
“While image generators used the equivalent of five seconds of microwave warming to generate a single 1,024 x 1,024 pixel image, video generators proved far more energy-intensive. To spit out a five-second clip, the researchers found that it takes the equivalent of running a microwave for over an hour.”
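Taking that quote at face value, and assuming a typical ~1,000-watt microwave (my assumption, not the researchers’), the gap works out to roughly:

```python
MICROWAVE_WATTS = 1000  # assumed typical microwave power draw (not from the study)

# 5 seconds of microwave time per 1,024 x 1,024 image
image_wh = MICROWAVE_WATTS * 5 / 3600      # ~1.4 Wh per image

# "over an hour" of microwave time per 5-second clip; use exactly 1 hour as a floor
video_wh = MICROWAVE_WATTS * 3600 / 3600   # ~1,000 Wh (1 kWh) per clip

ratio = video_wh / image_wh                # ~720x more energy per clip
```

In other words, one short AI-generated clip costs at least several hundred images’ worth of energy.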
…the Best Practices / AI energy efficiency?
ML’s carbon footprint can be dramatically reduced through recommended best practices, but in learning these, I found some were out of reach. So I weighed what’s ‘best’ against what’s ‘realistic’:
1. Model — Pick a smarter design
A. Some AI models are designed to skip unnecessary calculations, getting the same answer with way less work. Realistic: For simple questions, use smaller/simpler models. You don’t need GPT-5 to summarize an email or answer a factual question. A single GPT-5 query consumes approximately 18 watt-hours of electricity on average, roughly 60 times more than GPT-4o’s 0.3 watt-hours, according to University of Rhode Island researchers.
B. Text over images, images over video.
C. Researchers found that shortening instructions and responses reduces energy consumption. Every “regenerate response” click runs the full computation again. If the answer is close enough, edit it manually.
D. Fine-tuned models can use 30x less energy than general-purpose chatbots, more on this in a future article.
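The per-query figures above can be turned into a rough savings estimate. A minimal sketch, using the numbers cited in the text and a made-up daily query count:

```python
# Per-query averages cited above (University of Rhode Island estimate)
GPT5_WH = 18.0
GPT4O_WH = 0.3

ratio = GPT5_WH / GPT4O_WH  # ~60x more energy per query

# Hypothetical: route 50 simple queries per day to the smaller model
QUERIES_PER_DAY = 50
daily_saving_wh = QUERIES_PER_DAY * (GPT5_WH - GPT4O_WH)  # ~885 Wh/day
yearly_saving_kwh = daily_saving_wh * 365 / 1000          # ~323 kWh/year
```

That’s back-of-the-envelope math, not a measurement, but it shows why “pick a smaller model for small jobs” is the easiest lever most of us have.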
2. Machine — Use the right tool for the job
Specialized chips (like Google’s TPUs) use less energy, but this ties back to the Matthew Effect: only a handful of actors can afford to build or access them.
3. Map — Pick the right provider based on energy.
OpenAI / ChatGPT
Runs primarily on Microsoft Azure infrastructure.
Microsoft is matching 100% of the energy used by its Iowa data centers with renewable energy. Simple analogy: It’s like saying “I offset all my driving by paying someone else to plant trees.” The trees help, but your car still burned gasoline. Microsoft is buying renewable energy and taking other steps to meet its goal of being carbon negative by 2030, and it further aims to power all facilities with 100% renewable energy by 2025.
Realistic: it is the end of 2025, and it looks like they fell short; they are projected to fall short in 2030 as well.
Claude (Anthropic)
Runs across AWS, Google Cloud, and Azure. AWS matched 100% of electricity with renewables in 2024 and is the largest corporate renewable energy buyer for five years running.
Realistic: Anthropic does not publish independent energy or emissions data.
Google / Gemini
Google has matched 100% of its electricity use with renewable energy credits every year since 2017. Worldwide, about 66% of the company’s data center consumption is matched, hour by hour, with carbon-free electricity. In 2024, Google signed agreements for more than 8 GW of clean energy generation, the most it has contracted in any year. Google is attempting to run on 24/7 carbon-free energy on every grid where it operates by 2030. This means every hour of every day, the actual electricity flowing into its facilities comes from clean sources. This is vastly harder than annual matching.
Realistic: Google’s total electricity demand more than doubled from 2020 to 2024 due to AI growth. So the 2030 target looks unlikely.
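The difference between “matched 100% annually” and a true 24/7 goal is easy to miss, so here’s a toy sketch with made-up numbers:

```python
# Toy example (made-up numbers): a data center's demand vs clean supply
# across four sample hours.
demand = [100, 100, 100, 100]  # MWh consumed each hour
clean  = [160, 160, 40, 40]    # solar-heavy: surplus by day, deficit at night

# Annual-style matching: total clean purchases vs total consumption
annual_match = sum(clean) / sum(demand)  # 1.0 -> "100% matched"

# 24/7-style matching: clean energy only counts in the hour it's available
hourly_cfe = sum(min(d, c) for d, c in zip(demand, clean)) / sum(demand)
# 0.7 -> only 70% of actual consumption was clean, hour by hour
```

Same purchases, very different claims: the annual number hits 100% while the nighttime hours still run on whatever the grid is burning.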
Microsoft is doing the worst overall. All three are slow to get to carbon neutral, in part (or even primarily) because of AI. That’s not good. Choose wisely.
Time will Tell.
Early AI adoption drove efficiency gains; over time, environmental costs accumulated. Both perspectives, AI as helpful and as harmful, are correct, depending on when you look. And the pendulum will likely swing again as the big companies invest in our power grids.
I’m not trying to land on a conclusion here. I just wanted to understand. The data suggests this isn’t a simple story, but it’s worth paying attention to.
Co-written with AI. I brought the curiosity; I’m learning to bring the energy awareness too.