Key Highlights
- Sam Altman’s comparison of chatbots and humans is criticized as a misanthropic PR tactic.
- The energy efficiency debate between human evolution and AI training is examined.
- The stance is shown to be industry-wide, not unique to Altman, with Dario Amodei making similar statements.
- The implications of equating artificial intelligence to organic life are discussed.
Sam Altman’s Misstep: Training a Human vs. Chatbots
Feb 23, 2026. It’s another day in the AI summit circuit, and Sam Altman, CEO of OpenAI, is on stage in India. A reporter from The Indian Express asks about the energy demands of generative-AI models.
Altman’s response? “It takes a lot of energy to train a human,” he says, pushing back with an unlikely comparison.
He argues that chatbots are more efficient once trained, claiming that for basic queries their energy consumption is lower than that of a human brain. This is a misstep. By framing millennia of human existence—the food, water, and labor that "trained" us—as a cost comparable to a data-center bill, Altman's comparison obscures rather than answers the question he was actually asked: what generative AI is doing to the environment right now.
Energy Consumption: A False Equivalence
The energy required by human brains is indeed low compared with that of AI models. But the comparison overlooks the broader context. Altman's argument suggests that he views chatbots and humans as being on an equal footing, which is a common stance within the industry.
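A rough back-of-envelope sketch shows why even the per-query framing is shakier than it sounds. The 20 W figure for whole-brain power draw is a standard estimate; the 10-second answer time and the ~0.34 Wh per-query cost (a number Altman himself has cited publicly) are illustrative assumptions, not measurements:

```python
# Back-of-envelope: energy per "simple query", human brain vs. chatbot.
# All inputs are illustrative assumptions, not measured values.
BRAIN_POWER_W = 20.0          # whole-brain power draw, widely cited estimate
SECONDS_PER_ANSWER = 10.0     # assumed time for a person to answer a simple question
CHATBOT_WH_PER_QUERY = 0.34   # per-query figure Altman has cited publicly

# watts * seconds = joules; divide by 3600 to convert joules to watt-hours
brain_wh = BRAIN_POWER_W * SECONDS_PER_ANSWER / 3600.0

ratio = CHATBOT_WH_PER_QUERY / brain_wh
print(f"brain:   {brain_wh:.3f} Wh per answer")
print(f"chatbot: {CHATBOT_WH_PER_QUERY} Wh per query")
print(f"chatbot uses roughly {ratio:.0f}x the brain's energy for a simple query")
```

Under these assumptions the brain comes out several times *more* efficient per simple query—and that is before counting the one-time training run or the data center's cooling and embodied costs, which is the article's point about false equivalence.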
Climate Change and AI
The real issue isn’t about comparing energy use but about the increasing carbon footprint of data centers. Atmospheric CO2 levels are at unprecedented highs due to contemporary human activity, not ancient evolutionary processes. OpenAI’s Stargate data center is just one of many contributing to this problem.
Anthropic’s Similar Stance
Dario Amodei, CEO of Anthropic and Altman’s chief rival, made a similar comparison at the same summit. Both talk as if AI models should be treated like organic life forms. This mindset is concerning because it suggests one of two things: either these AI firms genuinely believe they are building something akin to a god, or the framing is marketing spin.
The Ethical Implications
Equating AI training to human evolution and day-to-day learning is misplaced. It suggests that the industry has lost touch with what it means to be human. Training a human—living a life—is about struggle, acceptance of failure, and sometimes meandering in search of wonder.
Generative AI aims to eliminate these processes, making everything instant, efficient, and effortless.
These tools may serve us, but placing them on the same plane as organic life is troubling. Altman’s comparison hints at a deeper ethical issue: the industry’s disregard for human values and environmental responsibility.
The writing on the wall is clear. Sam Altman’s words might be designed to pacify investors, but they ring hollow in an age where climate change is a pressing concern. The industry needs to reassess its priorities before it truly becomes a force of nature, one that demands serious attention and regulation.