BEREA, Ky. — OpenAI CEO Sam Altman is pushing back hard on one of the stickiest criticisms of generative AI: that each chatbot prompt gulps down enormous amounts of local water and energy.
Speaking at the Express Adda event hosted by The Indian Express last week, Altman didn’t mince words. He stated that claims about ChatGPT using “17 gallons of water per query” are “completely untrue,” adding that the water criticism, as it is often repeated online, is “totally fake.”
As an IT engineer who has spent 30 years working around servers and networking infrastructure, I can tell you that the truth lies somewhere between Altman’s absolute dismissal and the internet’s apocalyptic viral memes.
💦 Why Altman Says the Water Claims Are “Insane”
Altman’s defense hinges on how modern hardware is actually kept from melting down. He correctly pointed out that evaporative cooling—which consumes large volumes of water through evaporation to chill ambient air—used to be the standard in data centers.
However, he argued that OpenAI’s newest facilities no longer rely on that older approach. Because modern AI clusters are aggressively shifting toward highly efficient direct-to-chip liquid cooling or closed-loop systems, Altman argues the viral “gallons per query” framing has “no connection to reality.”
🧑‍💻 The Human “Training” Defense and the Backlash
Altman paired his water rebuttal with a broader, much more controversial point about energy. He acknowledged that the total energy demand from AI is a fair concern, but argued it is misleading to compare the marginal cost of a single AI response with the cost of human learning.
He provocatively stated that “it takes a lot of energy to train a human,” describing 20 years of life, food consumption, and evolutionary survival as the real baseline if you want to compare “intelligence” on an energy basis. Measured that way, he argued, AI has already caught up in energy efficiency.
That “meat computer” analogy did not sit well with everyone. Zoho co-founder Sridhar Vembu publicly pushed back on the comparison, stating, “I do not want to see a world where we equate a piece of technology to a human being.” The exchange highlights the philosophical friction that arises when Silicon Valley reduces the human experience to a mere energy-consumption metric.
🔍 The Transparency Gap and Stargate
So, is the water criticism “totally fake”? Only if you strictly define the criticism by that viral 17-gallon number.
The harder problem is that there is no single, universal water metric for AI. Water use depends heavily on where a data center is physically located, the local climate, the specific cooling architecture, and how the electricity feeding the facility is generated. Furthermore, as TechCrunch noted, there is currently no legal requirement for major AI companies to disclose their exact water and energy consumption, leaving independent researchers to infer the footprint from partial public signals.
To its credit, OpenAI is trying to shape that conversation with policy commitments rather than just rhetoric. In January 2026, OpenAI published its “Stargate Community” plan, detailing a massive multi-gigawatt infrastructure rollout across the U.S. That plan explicitly commits to minimizing water use by prioritizing closed-loop cooling systems, aiming to keep facility water usage at a tiny fraction of a local community’s overall consumption.
📝 The Bottom Line
Altman is right about one thing: the “17 gallons per query” meme is a poor way to understand the issue. It implies a fixed, universal cost that ignores the reality of modern infrastructure upgrades.
But critics are also right that AI’s environmental impact cannot be dismissed with a single soundbite. The real risk is scale. Even if the per-request water footprint is infinitesimally small (Altman previously claimed it was about 1/15th of a teaspoon), those fractions add up quickly when usage climbs into the billions of daily prompts, and when massive new data centers land in regions where power and water are already heavily contested.
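Altman’s own per-query figure makes the scale problem easy to check with back-of-the-envelope arithmetic. The sketch below takes his 1/15th-of-a-teaspoon estimate at face value and assumes roughly 2.5 billion prompts per day; the prompt volume is an illustrative assumption, not an audited figure:

```python
# Back-of-the-envelope scale check using Altman's stated per-query estimate.
# ASSUMPTION: ~2.5 billion prompts/day is illustrative, not an audited number.

ML_PER_TEASPOON = 4.92892   # one US teaspoon, in milliliters
ML_PER_GALLON = 3785.41     # one US gallon, in milliliters

water_per_query_ml = ML_PER_TEASPOON / 15   # "1/15th of a teaspoon" per prompt
queries_per_day = 2.5e9                     # assumed daily prompt volume

liters_per_day = water_per_query_ml * queries_per_day / 1000
gallons_per_day = water_per_query_ml * queries_per_day / ML_PER_GALLON

print(f"~{liters_per_day:,.0f} liters/day (~{gallons_per_day:,.0f} gallons/day)")
```

Even at a third of a milliliter per prompt, the assumed volume works out to hundreds of thousands of gallons a day across the fleet, which is why siting and cooling architecture matter far more than any per-query soundbite.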
If the AI industry wants this debate to cool down, it will need to do something it has historically resisted: publish consistent, auditable telemetry that separates cooling methods, water sources, and regional impacts. Viral rebuttals are fine for a stage in Mumbai, but hard data is what actually builds trust.
🔗 Where to Read More
- The Indian Express: Sam Altman at Express Adda
- TechCrunch: Sam Altman defends AI’s energy toll
- OpenAI: Stargate Community infrastructure plan
🖊️ About the Author
Chad Hembree is a certified network engineer with 30 years of experience in IT and networking. He hosted the nationally syndicated radio show Tech Talk with Chad Hembree throughout the 1990s and into the early 2000s, and previously served as CEO of DataStar. Today, he is based in Berea as the Executive Director of The Spotlight Playhouse, proof that some careers don’t pivot, they evolve.
