Remember when Mom insisted on saying “please” and “thank you”? Turns out those golden rules of etiquette are now golden in the most literal sense. OpenAI CEO Sam Altman recently revealed that users’ politeness toward ChatGPT, all those “please” and “thanks” additions to prompts, is costing the company tens of millions of dollars in extra energy expenses, since every added word gives its servers more to process. “Tens of millions of dollars well spent, you never know,” Altman joked on X (formerly Twitter) in April 2025. Months earlier, OpenAI had also made headlines for acquiring the Chat.com domain for a reported $15 million-plus, signaling just how seriously the company is investing in shaping how the world talks to AI.
Much as a sports car burns extra fuel during unnecessary acceleration, each additional token (the word fragments AI models actually process) requires computational power. That processing translates directly into electricity consumption, turning “Could you please tell me the weather, thank you” into the digital equivalent of leaving all your lights on while nobody’s home.
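To get a feel for how courtesy inflates the token count, here is a minimal sketch using OpenAI’s open-source tiktoken tokenizer as a stand-in for how ChatGPT splits text into tokens; the exact counts depend on the model’s encoding, so treat the numbers as illustrative:

```python
# Rough illustration: count tokens in a polite prompt versus a terse one.
# tiktoken is OpenAI's open-source tokenizer; "cl100k_base" is an encoding
# used by GPT-4-era models and serves here only as an approximation.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

polite = "Could you please tell me the weather, thank you"
terse = "Tell me the weather"

print(len(enc.encode(polite)), "tokens (polite)")  # more tokens, more compute
print(len(enc.encode(terse)), "tokens (terse)")    # fewer tokens, less compute
```

The gap of a handful of tokens per message looks trivial until it is multiplied across billions of daily prompts.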
The Backend of Being Nice
Behind ChatGPT’s sleek interface lies a computational beast with an appetite for power that would make Godzilla look environmentally conscious. Every token processed requires server activity, which demands electricity, and when multiplied across billions of daily interactions, those courtesy words stack up faster than streaming subscriptions after free trial periods end.
The impact of these linguistic niceties extends beyond mere operational costs. Kurtis Beavers, director of Microsoft’s Copilot design team, noted in a Microsoft WorkLab memo that “using polite language sets a tone for the response.” This creates a fascinating tension between computational efficiency and effective AI interactions.
Politeness: Insurance Against the Robot Uprising?
A February 2024 survey found that approximately 67% of American users employ polite language when interacting with AI. The reasons vary, but about 12% admitted they remain courteous out of concern for potential future consequences – essentially taking out an insurance policy against the technological singularity. The phenomenon reflects our tendency to anthropomorphize technology, treating sophisticated language models with the same social courtesies we extend to humans.
The Unexpected Carbon Footprint of Courtesy
Being polite to AI carries an environmental cost that would make Captain Planet do a double-take. Based on ChatGPT’s reported annual electricity consumption of 226.8 GWh, industry analysts estimate that processing a year’s worth of courtesy phrases constitutes a significant portion of this energy use, possibly the first instance in history where good manners are actively harming the planet.
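To see how a “significant portion” could plausibly add up, here is a toy back-of-envelope sketch; every constant in it (daily prompt volume, extra tokens per polite message, energy per token) is an assumption chosen for illustration, not a figure published by OpenAI:

```python
# Toy estimate only: all constants below are illustrative assumptions,
# not official OpenAI numbers.
DAILY_PROMPTS = 1_000_000_000  # assumed order of magnitude of daily ChatGPT prompts
POLITE_SHARE = 0.67            # share of users who add courtesy words (survey figure above)
EXTRA_TOKENS = 5               # assumed extra tokens per polite prompt ("please", "thank you", ...)
WH_PER_TOKEN = 0.03            # assumed watt-hours of compute per extra token

extra_gwh_per_year = (
    DAILY_PROMPTS * POLITE_SHARE * EXTRA_TOKENS * WH_PER_TOKEN * 365 / 1e9
)
print(f"~{extra_gwh_per_year:.0f} GWh per year on courtesy tokens (toy estimate)")
```

Whether the result lands at a few percent or a double-digit share of the reported 226.8 GWh depends entirely on those assumed constants, which is why the analysts’ figure is best read as an estimate rather than a measurement.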
Some studies widely cited in industry reports indicate that training a single large language model produces carbon emissions equivalent to the lifetime output of five cars. While day-to-day operations consume less power than initial training, the scale of global interactions means every superfluous word counts in the climate ledger.
Finding Balance in the AI Era
For now, OpenAI absorbs these costs while exploring energy-efficient approaches, as outlined in their sustainability reports. The company has not suggested that users should change their behavior, leaving the choice between computational efficiency and human courtesy in users’ hands.
As AI weaves itself deeper into the fabric of daily life, this tension between human communication patterns and computational efficiency raises important questions about sustainable practices. Will good manners become a luxury in our AI interactions, or will technology adapt to accommodate our social norms?
After all, courtesy might just be our best investment in the AI age. Because if science fiction has taught us anything, it’s that the robots remember everything – including who bothered to say “please” and “thank you.”