Core Viewpoint
- The article examines the hidden energy cost of polite language in AI interactions, framing it as a global resource-allocation dilemma as AI usage grows [6][8].

Group 1: Energy Consumption and AI
- Polite phrases such as "please" or "thank you" add extra tokens to every request, and processing a single token is cited at 0.0003 kWh, so the courtesy adds up at scale (a back-of-envelope sketch follows this summary) [9][12].
- ChatGPT processes roughly 200 million requests per day, while global data centers consume an estimated 415 billion kWh per year, enough to power Japan for 18 days [9][12].
- About 40% of that energy goes to cooling systems, heightening concerns about the environmental impact of AI technologies [9][14].

Group 2: Environmental Impact and AI Development
- The article challenges claims by tech giants such as Google and Microsoft that downplay AI's environmental impact, arguing that the cumulative effect of billions of polite requests creates a significant ecological burden [11][12].
- In Virginia, data centers consume more electricity than the state's entire residential sector, causing local ecological damage such as warmer water temperatures that have killed fish [13][14].

Group 3: Solutions and User Behavior
- Tech companies are pursuing different mitigation strategies, from OpenAI's $500 billion investment in new data centers to Meta's efforts to cut the energy use of its AI models [15][18].
- Cited research suggests that AI energy consumption could fall by 18% if users dropped polite language, implying that user behavior plays a meaningful role in energy efficiency [17][18].
- Proposed innovations such as "de-politeness" plugins and AI that anticipates user intent could further reduce unnecessary energy use in AI interactions (a hypothetical sketch of such a filter appears below) [17][18].
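
The per-token and per-request figures above lend themselves to a quick back-of-envelope check. The sketch below recomputes the annual energy attributable to courtesy phrases; KWH_PER_TOKEN and REQUESTS_PER_DAY are the article's figures, while POLITE_SHARE, EXTRA_TOKENS_PER_POLITE, and AVG_TOKENS_PER_REQUEST are illustrative assumptions, so the resulting overhead is highly sensitive to them and is not meant to reproduce the article's 18% estimate.

```python
# Back-of-envelope estimate of the energy attributable to "polite" tokens.
KWH_PER_TOKEN = 0.0003           # article figure: energy to process one token
REQUESTS_PER_DAY = 200_000_000   # article figure: ChatGPT requests per day
DAYS_PER_YEAR = 365

# Illustrative assumptions (not from the article):
POLITE_SHARE = 0.5               # fraction of requests that include a courtesy phrase
EXTRA_TOKENS_PER_POLITE = 4      # tokens added by "please" / "thank you"
AVG_TOKENS_PER_REQUEST = 100     # baseline tokens processed per request

baseline_kwh_year = (REQUESTS_PER_DAY * AVG_TOKENS_PER_REQUEST
                     * KWH_PER_TOKEN * DAYS_PER_YEAR)
polite_kwh_year = (REQUESTS_PER_DAY * POLITE_SHARE * EXTRA_TOKENS_PER_POLITE
                   * KWH_PER_TOKEN * DAYS_PER_YEAR)

print(f"Baseline ChatGPT energy:  {baseline_kwh_year / 1e9:.1f} billion kWh/year")
print(f"Extra 'polite' energy:    {polite_kwh_year / 1e6:.1f} million kWh/year")
print(f"Politeness overhead:      {polite_kwh_year / baseline_kwh_year:.1%}")
```

Under these particular assumptions the overhead comes out around 2%; larger assumed token counts or politeness rates push the figure up, which is presumably where estimates like the article's 18% come from.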
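The "de-politeness" plugin in Group 3 is mentioned only as a concept; the sketch below is a purely hypothetical illustration of what such a plugin might do, namely strip courtesy phrases from a prompt before it is tokenized so fewer tokens are processed. The phrase list, regex, and the function name strip_courtesy are assumptions, not the behavior of any real product named in the article.

```python
import re

# Hypothetical "de-politeness" filter: remove common courtesy phrases
# (plus adjacent spaces, commas, periods, exclamation marks) from a prompt.
COURTESY_RE = re.compile(
    r"\s*\b(please|thanks|thank you( very much)?|could you kindly)\b[\s,.!]*",
    flags=re.IGNORECASE,
)

def strip_courtesy(prompt: str) -> str:
    """Strip courtesy phrases and collapse leftover whitespace."""
    cleaned = COURTESY_RE.sub(" ", prompt)
    return re.sub(r"\s{2,}", " ", cleaned).strip()

if __name__ == "__main__":
    prompt = "Please summarize this report. Thank you very much!"
    print(strip_courtesy(prompt))   # -> "summarize this report."
```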
A Hidden War Over AI's Energy Consumption
投中网·2025-09-06 07:04