
°F AI: AI and Energy Consumption


The question isn’t whether AI consumes energy (it obviously does) but whether, taken as a whole and over time, AI is helping or hurting climate goals. Instead of thinking of the tech as inherently “good” or “bad” for the environment, it’s more useful to recognize that AI can reduce emissions in some contexts while increasing them in others, sometimes at the same time.

The Good, the Bad and…

AI systems already improve efficiency across industries. Machine learning is used to optimize power grids, balance energy demand, and enable precision agriculture that reduces fertilizer and pesticide use.

At the same time, the computing cost behind AI is substantial. Training large models can require energy comparable to what hundreds of households use over a year. Related research also highlights the Matthew Effect, the idea that “the rich get richer”: when tasks are simple, many participants can compete effectively, but as difficulty increases, small advantages in hardware, capital, or efficiency compound rapidly. Over time, this can lead to dominance by a very small number of actors.

The type of AI usage also matters. Text-based interactions are relatively lightweight. Video generation and real-time streaming, by contrast, require far more computation due to frame-by-frame processing. This matters because video already dominates global internet traffic, and AI-enhanced video compounds that demand as adoption grows.

“While image generators used the equivalent of five seconds of microwave warming to generate a single 1,024 x 1,024 pixel image, video generators proved far more energy-intensive. To spit out a five-second clip, the researchers found that it takes the equivalent of running a microwave for over an hour.”
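To put that microwave comparison in numbers, here is a back-of-envelope sketch. The ~1,100 W microwave rating is my assumption of a typical appliance, not a figure from the quoted study:

```python
# Back-of-envelope conversion of the microwave comparison above.
# The 1,100 W rating is an assumed typical microwave, not from the study.
MICROWAVE_WATTS = 1_100

def microwave_wh(seconds: float) -> float:
    """Energy in watt-hours a microwave draws when run for `seconds`."""
    return MICROWAVE_WATTS * seconds / 3600

image_wh = microwave_wh(5)        # one 1,024 x 1,024 image ~ 5 s of microwave time
clip_wh = microwave_wh(60 * 60)   # one 5-second clip ~ an hour of microwave time

print(f"image: ~{image_wh:.1f} Wh, clip: ~{clip_wh:.0f} Wh, "
      f"ratio: ~{clip_wh / image_wh:.0f}x")
```

Under that assumption, a single image lands around 1.5 Wh while a five-second clip lands around 1,100 Wh, a gap of several hundred times.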

 

…the Best Practices / AI energy efficiency?

ML’s carbon footprint can be dramatically reduced through recommended best practices, but as I learned them, some felt out of reach. So I considered what’s ‘best’ versus what’s ‘realistic’:

1. Model — Pick a smarter design

A. Some AI models are designed to skip unnecessary calculations, getting the same answer with far less work. Realistic: for simple questions, use smaller/simpler models. You don’t need GPT-5 to summarize an email or answer a factual question. A single GPT-5 query consumes approximately 18 watt-hours of electricity on average, roughly 60 times more than GPT-4o’s 0.3 watt-hours, according to University of Rhode Island researchers.
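Those per-query figures compound with usage. A minimal sketch using the University of Rhode Island numbers quoted above; the 50-queries-a-day volume is a hypothetical for illustration:

```python
# Per-query averages quoted above (University of Rhode Island researchers).
GPT5_WH_PER_QUERY = 18.0
GPT4O_WH_PER_QUERY = 0.3

def daily_wh(queries_per_day: int, wh_per_query: float) -> float:
    """Rough daily energy for a given query volume."""
    return queries_per_day * wh_per_query

# Hypothetical usage: 50 queries a day, routed to the big vs. the small model.
big = daily_wh(50, GPT5_WH_PER_QUERY)     # 900 Wh/day
small = daily_wh(50, GPT4O_WH_PER_QUERY)  # 15 Wh/day

print(f"big model: {big:.0f} Wh/day, small model: {small:.0f} Wh/day, "
      f"ratio: {GPT5_WH_PER_QUERY / GPT4O_WH_PER_QUERY:.0f}x")
```

The point isn’t the exact numbers; it’s that routing routine questions to a smaller model cuts the energy bill by the same ~60x factor, every day.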

 

B. Text over images, images over video.

 

C. Researchers found that shortening instructions and responses reduces energy consumption. Every “regenerate response” click runs the full computation again. If the answer is close enough, edit it yourself.

 

D. Fine-tuned models can use 30x less energy than general-purpose chatbots; more on this in a future article.

 

2. Machine — Use the right tool for the job

Specialized chips (like Google’s TPUs) use less energy, but access to them loops back to the Matthew Effect.

 

3. Map — Pick a provider whose data centers run on cleaner energy.

Time Will Tell

Early AI adoption drove efficiency gains; over time, environmental costs accumulated. Both perspectives, AI as helpful and AI as harmful, are correct depending on when you look. And the pendulum will likely swing again if the big companies invest in the power grid itself.

I’m not trying to land on a conclusion here. I just wanted to understand. The data suggests this isn’t a simple story, but it’s worth paying attention to.

Co-written with AI. I brought the curiosity; I’m learning to bring the energy awareness too.
