‘It takes a lot of energy to train a human too’: OpenAI’s Sam Altman counters AI power criticism

Ahmad Junaid | Blog | February 22, 2026


OpenAI CEO Sam Altman on Saturday countered claims about the environmental cost of artificial intelligence (AI), rejecting widely shared estimates about ChatGPT’s water and energy use. 

He dismissed the claim that each ChatGPT query consumes 17 gallons of water. “Water is totally fake; it used to be true. We used to do evaporative cooling in data centers, but now we don’t do that,” he said while speaking to The Indian Express. “You see these things on the internet like ‘don’t use ChatGPT, it’s 17 gallons of water for each query or whatever’. This is completely untrue. Totally insane. No, no connection to reality.”


Altman acknowledged that the broader question of electricity use is valid, though not at the level often claimed per query. “What is fair, though, is the energy consumption, not per query, but in total, because the world is now using so much AI, which is real, and we need to move towards nuclear or wind and solar very quickly.”

The environmental impact of artificial intelligence has come under increasing scrutiny. The United Nations said in January this year that global electricity demand is growing rapidly and is expected to increase by more than 10,000 terawatt-hours by 2035 – equivalent to the total consumption of all advanced economies today.

The International Energy Agency has said data-centre demand rose by more than three quarters between 2023 and 2024, and could account for over 20 per cent of electricity-demand growth in advanced economies by 2030. In the United States, AI-driven data processing is projected to exceed the combined electricity use of aluminium, steel, cement and chemical production by the end of the decade.

Altman also disputed a claim attributed to Bill Gates that a single ChatGPT query consumes the equivalent of one iPhone battery charge – down from an earlier estimate of 10. “There’s no way it’s anything close to that much. It’s way, way, way less,” he said. 
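As a rough sense of scale, one can sanity-check the “one iPhone charge per query” claim with a back-of-envelope sketch. The numbers below are illustrative assumptions, not figures from the interview: an iPhone battery is taken to hold roughly 13 Wh, and the per-query energy is the ~0.34 Wh figure Altman has cited publicly elsewhere.

    # Back-of-envelope check of the "one iPhone charge per query" claim.
    # Both inputs are assumptions, not figures from the interview:
    #   - iPhone battery capacity: ~13 Wh (typical recent models)
    #   - energy per ChatGPT query: ~0.34 Wh (figure Altman has cited publicly)
    IPHONE_CHARGE_WH = 13.0
    QUERY_WH = 0.34

    queries_per_charge = IPHONE_CHARGE_WH / QUERY_WH
    print(f"One iPhone charge covers roughly {queries_per_charge:.0f} queries at 0.34 Wh each")
    # => about 38 queries per charge, i.e. far below "one charge per query"

Under those assumptions, a single charge would cover dozens of queries, which is the direction of Altman’s “way, way, way less” remark.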

He argued that comparisons between AI and human energy use are often framed incorrectly. “One of the things that is always unfair in this comparison is people talk about how much energy it takes to train an AI model relative to how much it costs a human to do one inference query,” he said.

“But it also takes a lot of energy to train a human,” the OpenAI CEO said. “It takes 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the hundred billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever to produce you and then you took whatever you took.”

According to Altman, the more meaningful comparison is between a trained AI model and a trained human performing a task. “So the fair comparison is if you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human?” he said. “And probably AI has already caught up on an energy efficiency basis. Measured that way.”
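That per-answer comparison can also be sketched with rough numbers. The figures below are illustrative assumptions, not data from the interview: a human brain is taken to draw about 20 W, a question is assumed to take on the order of a minute to answer, and the model-side figure is again the ~0.34 Wh per query that Altman has cited publicly elsewhere.

    # Rough comparison of per-answer energy: trained human vs. trained model.
    # All inputs are illustrative assumptions, not figures from the interview:
    BRAIN_POWER_W = 20.0    # approximate power draw of a human brain
    ANSWER_TIME_S = 60.0    # assume about a minute to think through a question
    QUERY_WH = 0.34         # per-query figure Altman has cited publicly elsewhere

    human_wh = BRAIN_POWER_W * ANSWER_TIME_S / 3600.0  # convert watt-seconds to Wh
    print(f"Human: ~{human_wh:.2f} Wh per answer vs model: ~{QUERY_WH:.2f} Wh per query")
    # => both land around a third of a watt-hour per answer

On those assumptions, both sides come out around a third of a watt-hour per answer, which is the rough parity Altman describes, though the result depends entirely on the numbers chosen.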
 
