Most people are wired to be polite, regardless of who, or what, they are talking to. AI tools like ChatGPT have become part of daily life, and even a few extra words in a prompt carry a cost. ChatGPT understands and responds to messages using computing power drawn from large data centers, and the longer or more detailed the message, the more energy it takes to process. Earlier estimates put a single ChatGPT response at about 2.9 watt-hours of electricity; newer versions are more efficient, using around 0.3 watt-hours per response.
A December 2024 survey by Future, the publisher of TechRadar, found that 67% of US AI users are polite to AI, compared with 71% in the UK. Among the more than 1,000 people surveyed, most of those who skip the pleasantries said they do so for brevity, while 12% of the polite users admitted their courtesy stems from fear of future consequences.
Hidden costs of being polite with AI
OpenAI CEO Sam Altman has revealed that good manners toward the company's AI models come with a significant electricity bill. Replying to a user on X, Altman said that politeness has cost the company “tens of millions of dollars.”
A report from the Electric Power Research Institute estimates that asking ChatGPT a question costs roughly 10 times the energy of a Google search. The rapid buildout of data centers since 2020 has driven a sharp rise in Big Tech's CO2 emissions: Google reported emissions 48 percent higher in 2023 than in 2019, primarily due to increased data center energy consumption and supply chain emissions.
AI models like ChatGPT use more electricity than conventional software because they perform enormous numbers of calculations every time they are prompted. Over the course of a year, if one in ten working Americans used GPT-4 once a week, the electricity required would equal what every residence in Washington, D.C., uses in almost three weeks. A single ChatGPT question uses enough electricity to run a lightbulb for roughly twenty minutes. Microsoft and Google, two of the largest AI investors in the world, are already disclosing significant increases in emissions. The massive heat generated by AI servers also requires continuous cooling, frequently via water-intensive systems (eponline).
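The lightbulb comparison is simple unit arithmetic. As a rough sketch, assuming a 10 W LED bulb (the bulb wattage is an assumption, not from the article) and the earlier estimate of about 2.9 watt-hours per response:

```python
# Back-of-envelope check of the lightbulb comparison.
# Assumption (not from the article): a 10 W LED bulb.

WH_PER_RESPONSE = 2.9   # earlier per-response estimate, in watt-hours
BULB_WATTS = 10         # assumed LED bulb power draw

# Energy (Wh) = power (W) x time (h), so time = energy / power.
hours = WH_PER_RESPONSE / BULB_WATTS
minutes = hours * 60
print(f"One response could run a {BULB_WATTS} W bulb for ~{minutes:.0f} minutes")
```

At these assumed numbers the result is about 17 minutes, which matches the article's "roughly twenty minutes"; a slightly dimmer bulb or the older, less efficient model pushes it higher.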
According to a University of California, Riverside study, using GPT-4 to generate 100 words can consume up to three bottles of water. Experts caution that AI's environmental effects are only starting to become apparent as it permeates daily life. Rene Haas, CEO of chip designer Arm Holdings, recently told The Wall Street Journal that AI may account for 25% of America's electricity use by 2030.
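To see how that per-request water figure scales, here is a hedged sketch; the 0.5-liter bottle size and the once-a-day usage pattern are assumptions for illustration, not figures from the study:

```python
# Rough scaling of the UC Riverside water figure quoted above.
# Assumptions (not from the article): 0.5 L per bottle, and one
# 100-word generation per day for a year.

BOTTLES_PER_100_WORDS = 3    # upper-bound figure from the study
LITERS_PER_BOTTLE = 0.5      # assumed bottle size

liters_per_100_words = BOTTLES_PER_100_WORDS * LITERS_PER_BOTTLE
yearly_liters = liters_per_100_words * 365
print(f"Up to {yearly_liters:.0f} liters per year for a daily 100-word generation")
```

Under these assumptions, a single user's daily 100-word generation could draw on the order of 500 liters of water a year, which is why experts worry about the aggregate effect across millions of users.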
Positives of using polite language for ChatGPT
Despite these costs, many AI experts contend that treating chatbots with courtesy matters for the humans who train and use them. According to research from Microsoft's WorkLab, users who phrase requests politely typically receive better, more helpful responses from AI programs like Copilot.
Kurtis Beavers, a design director for Microsoft Copilot, highlights the value of basic etiquette when engaging with AI: generative AI, he notes, mirrors the professionalism, clarity, and precision of the prompts it is given.
Besides often earning the same courtesy in return, being polite to AI chatbots can improve their responses. A 2024 Waseda University study found that impolite prompts reduced performance by as much as 30%, while polite prompts reduced errors and drew more information from a wider range of sources.