Making an Image with Generative AI Uses as Much Energy as Charging Your Phone - MIT Technology Review

While training AI models is highly energy intensive, most of their carbon footprint comes from their use (inference).

SELF-REPORTING

In a recent blog post, OpenAI's Sam Altman claimed that an average ChatGPT query uses "roughly one-fifteenth of a teaspoon" of water and "about 0.34 watt-hours" of electricity. Multiply that by billions of queries per day and the scale of the problem comes into view.
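To see what those per-query figures add up to, here is a back-of-the-envelope sketch. The 1 billion queries per day is an illustrative assumption, not a figure from the sources above:

```python
# Back-of-the-envelope scaling of Altman's self-reported per-query figures.
WH_PER_QUERY = 0.34            # watt-hours per query (Altman's figure)
TSP_WATER_PER_QUERY = 1 / 15   # teaspoons of water per query (Altman's figure)
QUERIES_PER_DAY = 1_000_000_000  # ASSUMPTION: illustrative daily query volume

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000             # Wh -> MWh
daily_liters = TSP_WATER_PER_QUERY * QUERIES_PER_DAY * 0.00492892  # tsp -> liters

print(f"~{daily_mwh:,.0f} MWh and ~{daily_liters:,.0f} liters of water per day")
# ~340 MWh and ~328,595 liters of water per day
```

Even at these small per-query numbers, a billion daily queries lands in the hundreds of megawatt-hours per day.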

OpenAI's Sam Altman Reveals How Much Energy A ChatGPT Query Needs - NDTV

STUDIES

The energy demands of AI are staggering and accelerating quickly. According to Lawrence Berkeley National Laboratory, data centers accounted for 4.4% of total U.S. electricity consumption in 2023.
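To put that percentage in absolute terms, here is a rough calculation. The ~4,000 TWh figure for total 2023 U.S. electricity consumption is an assumption for illustration, not taken from the sources above:

```python
# What a 4.4% share of U.S. electricity implies in absolute terms.
US_TOTAL_TWH = 4_000        # ASSUMPTION: approx. total U.S. consumption, 2023
DATA_CENTER_SHARE = 0.044   # LBNL's reported data center share

data_center_twh = US_TOTAL_TWH * DATA_CENTER_SHARE
print(f"~{data_center_twh:.0f} TWh")  # ~176 TWh
```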

AI on the Edge: Can Distributed Computing Disrupt the Data Center Boom? - Power Mag

A single day of ChatGPT queries can consume around 1 GWh, roughly the daily electricity consumption of about 33,000 U.S. households.
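The household comparison checks out with a quick calculation. The 30 kWh/day per household is an assumed average (roughly consistent with EIA's ~10,800 kWh/year estimate), not a figure from the source above:

```python
# Sanity check: 1 GWh/day expressed in U.S. households.
GWH_PER_DAY = 1
KWH_PER_HOUSEHOLD_PER_DAY = 30  # ASSUMPTION: avg. U.S. household, ~10,800 kWh/yr

households = GWH_PER_DAY * 1_000_000 / KWH_PER_HOUSEHOLD_PER_DAY
print(f"~{households:,.0f} households")  # ~33,333 households
```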

Q&A: UW researcher discusses just how much energy ChatGPT uses - University of Washington Press

The lowest-consuming AI video generators take 3.4 million joules to produce a five-second, 16 fps video, the equivalent of running a microwave for over an hour. Multiply that by the number of users and regenerated prompts.
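The microwave comparison can be verified directly. The 800-watt draw is an assumed typical microwave rating, not a figure from the source above:

```python
# Sanity check: 3.4 million joules expressed as microwave runtime.
ENERGY_J = 3_400_000   # energy per five-second AI-generated video (source figure)
MICROWAVE_W = 800      # ASSUMPTION: typical microwave power draw in watts

minutes = ENERGY_J / MICROWAVE_W / 60  # joules / watts = seconds; then -> minutes
print(f"~{minutes:.0f} minutes of microwave use")  # ~71 minutes
```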

We Did the Math on AI's Energy Footprint. Here's the Story You Haven't Heard - MIT Technology Review

In the U.S., the grid that powers data centers is still heavily reliant on fossil fuels, and surging demand for immediately available energy is only making that worse. For example, Elon Musk's xAI data center outside Memphis uses 35 methane gas generators to keep its chips humming.

How Much Electricity It Actually Takes to Use AI May Surprise You - Futurism

this website was created without the use of artificial intelligence