Saying ‘Thank You’ to ChatGPT Is Costly. But Maybe It’s Worth the Price.

Adding words to our chatbot prompts can apparently cost tens of millions of dollars. But some fear the cost of not saying please or thank you could be higher.

The question of whether to be polite to artificial intelligence may seem a moot point — it is artificial, after all. But Sam Altman, the chief executive of the artificial intelligence company OpenAI, recently shed light on the cost of adding an extra “Please!” or “Thank you!” to chatbot prompts. Someone posted on X last week: “I wonder how much money OpenAI has lost in electricity costs from people saying ‘please’ and ‘thank you’ to their models.” The next day, Mr. Altman responded: “Tens of millions of dollars well spent — you never know.”

First things first: Every single ask of a chatbot costs money and energy, and every additional word in that ask increases the cost for a server.
Neil Johnson, a physics professor at George Washington University who has studied artificial intelligence, likened extra words to packaging used for retail purchases. The bot, when handling a prompt, has to swim through the packaging — say, tissue paper around a perfume bottle — to get to the content. That constitutes extra work.
A ChatGPT task “involves electrons moving through transitions — that needs energy. Where’s that energy going to come from?” Dr. Johnson said, adding, “Who is paying for it?”