We’re spending millions to be polite to machines. Literally.
Sam Altman, the CEO of OpenAI, let slip this week that all the “please” and “thank you” messages people type to ChatGPT are racking up an expensive electricity bill.
When someone on X (formerly Twitter) tossed out a half-joking question about how much those extra words are costing OpenAI in electricity, Altman responded directly: “Tens of millions of dollars well spent. You never know.”
No one’s sure if he was being flippant or philosophical. But it’s clear that being courteous to a machine isn’t as harmless as it seems.
While many think these small acts of digital politeness are sweet or habit-driven, they come with a price tag. Every word you type gets processed through massive data centres spread across the world, each of them drawing power like industrial factories. And those centres don’t blink at the environmental cost—because they can’t.
Think about this: a 100-word AI-generated email can consume 0.14 kilowatt-hours of electricity. That’s enough to keep 14 LED bulbs on for an hour. Multiply that by millions of users typing away every day, and the scale becomes dizzying.
A Washington Post investigation with researchers at the University of California found that even simple tasks using these AI tools come with surprising energy costs. Over a year, sending just one AI-generated email per week could burn through about 7.5 kilowatt-hours, roughly the electricity nine average Washington, DC households use in an hour.
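If you want to see how those comparisons stack up, here is a quick back-of-envelope sketch in Python. The 0.14 kWh per email comes from the reporting above; the 10-watt bulb rating and the roughly 0.83 kWh an average household uses per hour are assumptions plugged in for illustration, not figures from the investigation.

```python
# Back-of-envelope check of the energy comparisons above.
# Assumed values (not from the Washington Post piece): a 10 W LED bulb
# and roughly 0.83 kWh of electricity used by an average household per hour.

EMAIL_KWH = 0.14                 # reported energy for one 100-word AI email
LED_BULB_KW = 0.010              # assumed 10-watt LED bulb
HOUSEHOLD_KWH_PER_HOUR = 0.83    # assumed average hourly household use

bulb_hours = EMAIL_KWH / LED_BULB_KW          # bulbs lit for one hour
annual_kwh = EMAIL_KWH * 52                   # one email a week for a year
household_hours = annual_kwh / HOUSEHOLD_KWH_PER_HOUR

print(f"One email ~ {bulb_hours:.0f} LED bulbs lit for an hour")
print(f"One email a week for a year ~ {annual_kwh:.1f} kWh")
print(f"That is ~ {household_hours:.0f} household-hours of electricity")
```

Under those assumptions, one email works out to about 14 bulb-hours, and a year of weekly emails lands near 7.3 kWh, in the same ballpark as the figures cited above.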
Altman may have brushed off the cost as worthwhile, but the bigger issue doesn’t go away. Data centres, including those behind AI platforms, already account for roughly 2% of the world’s electricity usage.
That figure is climbing fast, especially as companies push AI into more corners of everyday life—from finance to education, customer service to entertainment.
Still, not everyone sees a problem with politeness. Kurtis Beavers, who leads design for Microsoft Copilot, believes language matters. “Using polite language sets a tone for the response,” he says.
A memo from Microsoft WorkLab backs that up: “When it clocks politeness, it’s more likely to be polite back.” The logic is simple: speak kindly to machines, and they’ll treat you kindly in return.
Some users clearly agree. A 2024 survey showed that 67% of Americans are deliberately polite to AI tools. Most said it’s just good manners. Some went further—about 12% admitted they’re being nice just in case the robots take over one day.
That sounds ridiculous until you realise how easily we anthropomorphise tech. We talk to machines like we talk to people, assign them personalities, and even apologise when we type something wrong. It’s no surprise, then, that some of us say “thank you” after getting a chatbot’s response—even when we know it doesn’t care.
But politeness alone doesn’t justify the energy drain. As the environmental toll grows, so does the urgency to rethink how we engage with these systems.
OpenAI, to its credit, has been exploring alternatives, backing clean-energy ideas such as solar technology and even nuclear fusion in a bid to shrink its carbon footprint.
Yet that doesn’t solve today’s problem. We’re treating ChatGPT and other tools like digital companions, but the infrastructure behind them is anything but friendly. Every extra word costs power. Every polite gesture drains a bit more from the planet.
So next time you’re tempted to write “please” to ChatGPT, maybe pause. Is it necessary? Or are we just being polite to prove we’re still human?
Either way, someone’s paying the bill—and the Earth might be footing the tab.