ChatGPT feels immaterial when you type a question and the text scrolls back in milliseconds, yet behind each exchange, recent power-consumption estimates trace a lattice of servers and cables.
Analysts now suggest that ChatGPT's annual energy use may soon rival the electricity drawn by significant slices of the New York City grid, turning a conversational bot into an infrastructure-scale consumer and sharpening debates over efficiency, climate risk, and the hidden price of convenience.
What the numbers say about ChatGPT’s energy footprint
BestBrokers, using figures from the University of Rhode Island AI Lab, estimates that each ChatGPT prompt draws around 18.9 watt-hours, or 0.0189 kWh. Multiplied by global traffic, that estimate hints at a growing energy footprint for everyday AI conversations worldwide.
The study links that single-prompt figure with publicly available traffic data from OpenAI’s chatbot. By merging this benchmark with detailed usage statistics, BestBrokers scales ChatGPT activity to an annual power draw. The team outlines a transparent data methodology and treats the 18.9 Wh value as typical per-query consumption for current deployments.
From one query to 17 terawatt-hours: how usage scales
BestBrokers estimates that around 810 million people use ChatGPT in a typical week, sending roughly 22 prompts each. That behaviour leads to more than 2.5 billion queries every day and turns a single 18.9 Wh response into a sizeable, steady flow of electricity.
From that traffic, the analysts derive around 47.2 million kWh of power use each day and roughly 17.228 billion kWh per year, or 17.23 TWh. That total tracks ChatGPT's rapid growth in weekly active users worldwide. Taken together, billions of requests per day translate into an aggregate electricity demand that now rivals the annual consumption of places such as Puerto Rico, Slovenia or Costa Rica.
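The scaling arithmetic can be reproduced in a few lines. Using the cited inputs (18.9 Wh per prompt, 810 million weekly users, 22 prompts each), the result lands slightly above the report's published 47.2 million kWh per day and 17.23 TWh per year, presumably because the published inputs are rounded:

```python
# Sketch of BestBrokers' scaling arithmetic, using the figures cited above.
WH_PER_PROMPT = 18.9                     # Wh per ChatGPT query (URI AI Lab estimate)
WEEKLY_USERS = 810e6                     # weekly active users
PROMPTS_PER_USER_PER_WEEK = 22

daily_queries = WEEKLY_USERS * PROMPTS_PER_USER_PER_WEEK / 7
daily_kwh = daily_queries * WH_PER_PROMPT / 1000    # Wh -> kWh
annual_twh = daily_kwh * 365 / 1e9                  # kWh -> TWh

print(f"{daily_queries / 1e9:.2f} billion queries per day")  # ≈ 2.55 billion
print(f"{daily_kwh / 1e6:.1f} million kWh per day")          # ≈ 48.1 million
print(f"{annual_twh:.2f} TWh per year")                      # ≈ 17.56 TWh
```

The small gap between 17.56 and 17.23 TWh is within the rounding of the report's own inputs.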
Comparisons with New York City and national electricity use
BestBrokers sets ChatGPT’s projected 17.23 TWh of yearly electricity use against real-world power systems. On that basis, the chatbot’s draw alone could keep New York City running for around 113 days, nearly four months of typical citywide consumption across homes, subway lines and office towers.
Across entire countries, the report finds that this power could run the United States for about 34 hours and the United Kingdom for nearly 20 days. Its grid comparisons add that 17.23 TWh could supply every US home, at a typical 29 kWh per day, for more than four and a half days, or deliver roughly 238 million full charges of a 72.4 kWh electric-vehicle battery.
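These comparisons can be sanity-checked against the 17.23 TWh total. The 132 million US-household figure below is my assumption for illustration, not a number from the report; the NYC and US demand figures are simply what the report's "113 days" and "34 hours" imply:

```python
# Sanity-checking the report's comparisons against its 17.23 TWh annual total.
ANNUAL_KWH = 17.23e9

ev_charges = ANNUAL_KWH / 72.4                  # full charges of a 72.4 kWh battery
home_days = ANNUAL_KWH / (132e6 * 29)           # assumed 132M US homes at 29 kWh/day
nyc_daily_gwh = ANNUAL_KWH / 113 / 1e6          # NYC demand implied by "113 days"
us_annual_twh = ANNUAL_KWH / 34 * 8760 / 1e9    # US demand implied by "34 hours"

print(f"{ev_charges / 1e6:.0f} million EV charges")        # ≈ 238 million
print(f"{home_days:.1f} days for every US home")           # ≈ 4.5 days
print(f"NYC implied demand: {nyc_daily_gwh:.0f} GWh/day")
print(f"US implied demand: {us_annual_twh:.0f} TWh/year")
```

The implied US annual demand of roughly 4,400 TWh is close to real-world figures, which suggests the report's "34 hours" comparison is internally consistent.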
Methodology and caveats behind the headline figures
BestBrokers bases its calculations on work from the University of Rhode Island AI Lab, which places a single ChatGPT response at 18.9 Wh, or 0.0189 kWh, of electricity. That benchmark, combined with traffic estimates and a US commercial tariff of 0.141 dollars per kWh as of September 2025, defines the model’s annual bill.
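The annual bill implied by those two inputs is straightforward to derive; the dollar figure below is my arithmetic on the report's numbers, not a total the report itself quotes:

```python
# Rough annual electricity bill implied by the report's inputs.
ANNUAL_KWH = 17.23e9         # projected annual consumption
USD_PER_KWH = 0.141          # US commercial tariff, September 2025 (per the report)

annual_cost = ANNUAL_KWH * USD_PER_KWH
print(f"${annual_cost / 1e9:.2f} billion per year")  # ≈ $2.43 billion
```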
Alan Goldberg of BestBrokers notes that the totals describe only inference, not the training cycles behind OpenAI’s announced GPT‑5.2, where a single upgrade may use tens of gigawatt-hours. Broad assumptions and wide margins make 17.23 TWh an approximate rather than exact figure.