OpenAI CEO Sam Altman says the average ChatGPT query uses 'about 1/15 of a teaspoon' of water.

Due to the rapid spread of AI, global data center energy demand is expected to reach the level of Japan's total power consumption by 2026. But energy is not the only concern: the water consumed by data center cooling systems is also a problem. OpenAI CEO Sam Altman has revealed that the average ChatGPT query uses about 0.000085 gallons (roughly 0.32 ml) of water.
The Gentle Singularity - Sam Altman
https://blog.samaltman.com/the-gentle-singularity
Sam Altman claims an average ChatGPT query uses 'roughly one fifteenth of a teaspoon' of water | The Verge
https://www.theverge.com/news/685045/sam-altman-average-chatgpt-energy-water
In a blog post published on June 11, 2025, Altman wrote that we are past the event horizon and that humanity is getting close to building digital superintelligence, yet so far things are less strange than one might expect: robots are not walking the streets, most people do not talk to AI all day, people still die of disease, we still cannot easily travel to space, and there is much about the universe we do not understand. He predicts that the technological singularity will arrive not as a sudden break but as gradual change.
also, here is one part that people not interested in the rest of the post might still be interested in: pic.twitter.com/ANDhHu9g3g
— Sam Altman (@sama) June 10, 2025
At the same time, AI is already contributing to the world in various ways: some scientists report being two to three times more productive than before adopting AI, and Altman claims that ChatGPT is already more powerful than any human who has ever lived.
He also argues that technological progress will accelerate dramatically once AI, which is software, can create hardware such as robots. Altman described this self-reinforcing loop as a compounding flywheel for building the infrastructure needed to run AI systems, and predicted that it will not be long before AI infrastructure is built by AI itself.
According to Altman, realizing what he calls a 'data center that can build other data centers' would also significantly reduce the cost of operating AI.
'As data center operations become more automated, the cost of intelligence should eventually converge to near the cost of electricity,' Altman said. 'People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water, roughly one fifteenth of a teaspoon.'
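Altman's figures can be sanity-checked with simple unit conversions. A minimal sketch in Python, where the 1,000 W oven and 10 W LED bulb wattages are illustrative assumptions not taken from the post:

```python
# Back-of-the-envelope check of Altman's per-query figures.
# Assumed appliance wattages (not from the post): 1,000 W oven, 10 W LED bulb.

QUERY_WH = 0.34           # Altman's stated energy per query, in watt-hours
QUERY_GALLONS = 0.000085  # Altman's stated water per query

ML_PER_GALLON = 3785.41   # milliliters per US gallon
TSP_ML = 4.92892          # milliliters per US teaspoon

water_ml = QUERY_GALLONS * ML_PER_GALLON   # ~0.32 ml
tsp_fraction = water_ml / TSP_ML           # ~1/15 of a teaspoon

oven_seconds = QUERY_WH / 1000 * 3600      # ~1.2 s of oven use
led_minutes = QUERY_WH / 10 * 60           # ~2 min of LED use

print(f"water per query: {water_ml:.2f} ml (~1/{1 / tsp_fraction:.0f} teaspoon)")
print(f"oven time: {oven_seconds:.1f} s, LED time: {led_minutes:.1f} min")
```

Both comparisons in the quote check out under these assumptions: the water works out to about 0.32 ml, or one part in fifteen of a teaspoon, and 0.34 Wh matches roughly 1.2 seconds of oven use or two minutes of LED use.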
The Verge, an IT news site, asked OpenAI how this figure was calculated but did not receive a response. How much water an AI consumes depends on where its data center is located, but according to 2024 estimates by the Washington Post, produced in collaboration with AI researchers, generating a 100-word email with a GPT-4-based chatbot requires just over one bottle of water.
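The gap between the two figures is large. A rough comparison, assuming a 500 ml bottle (a size neither source specifies) and Altman's ~0.32 ml per query:

```python
# Rough comparison of Altman's per-query water figure with the
# Washington Post's per-email estimate. The 500 ml bottle size is an
# assumption for illustration; neither source states an exact volume.

ALTMAN_QUERY_ML = 0.32  # Altman's ~1/15 teaspoon per query
BOTTLE_ML = 500         # assumed size of "one bottle of water"

ratio = BOTTLE_ML / ALTMAN_QUERY_ML
print(f"one bottle covers roughly {ratio:,.0f} queries at Altman's rate")
```

Under these assumptions, one bottle corresponds to over 1,500 queries at Altman's rate, underscoring how much the estimates diverge depending on methodology and data center location.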

Altman believes that reducing the running costs of AI and making superintelligence widely accessible is one of the best ways to avoid the serious risks AI poses.
Altman concludes his article by saying, 'Intelligence too cheap to meter is within our reach. It may sound crazy to say this, but if we were in 2020 and predicted where we are today in 2025, it would probably sound crazier than our current predictions for 2030,' and closes with the hope that we scale smoothly, exponentially, and uneventfully through superintelligence.
in Software, Posted by log1l_ks