Core Viewpoint
- OpenAI CEO Sam Altman disclosed specific energy consumption data for ChatGPT queries, putting the average at 0.34 watt-hours per query, a figure that has sparked significant discussion about the environmental impact of AI technology [1]

Group 1: Energy Consumption Data
- ChatGPT's average energy consumption per query is 0.34 watt-hours, roughly what an energy-saving light bulb uses in two minutes; Altman also put per-query water consumption at about 1/15 of a teaspoon [1]
- Independent research from Epoch.AI aligns closely with OpenAI's data, estimating GPT-4o's energy consumption at approximately 0.0003 kilowatt-hours per query, which supports the credibility of OpenAI's figure [2]
- A more detailed study by a team led by Nidhal Jegham found that consumption varies by model: GPT-4.1 nano uses about 0.000454 kilowatt-hours per query, while GPT-4.5 uses up to 0.03 kilowatt-hours on longer tasks [3]

Group 2: Hardware and Operational Analysis
- At an estimated 1 billion queries per day, OpenAI's total daily energy consumption would be around 340 megawatt-hours, implying substantial server infrastructure (see the sketch after this summary) [6]
- A theoretical model built on these figures posits that OpenAI would need roughly 3,200 servers, each handling about 4.5 queries per second, although the feasibility of this estimate has been questioned [6]
- While the theoretical server count is debatable, it is not easily dismissed: experiments show token generation speeds vary widely across models and workloads, so throughput in that range is plausible [6]

Group 3: Skepticism and Criticism
- Some experts question the completeness of OpenAI's figure, suggesting it may cover only GPU server energy use and overlook other infrastructure costs [7]
- Critics argue that the server count derived from OpenAI's figure may be too low to handle global user demand; some estimates put the actual number of GPUs in the tens of thousands [8]
- There are also concerns about the lack of transparency in OpenAI's disclosure, including missing context on what constitutes an "average query" and the absence of carbon emission data [9]

Group 4: Industry Context and Implications
- OpenAI's disclosure coincides with similar reports from Nvidia and Google, indicating a potential shift toward greater transparency in the tech industry regarding energy consumption [10]
- Discussion of AI energy consumption usually emphasizes the training phase, yet inference-phase energy use could surpass training within a relatively short time frame [10]
- The ongoing debate raises the question of whether OpenAI's data represents a genuine effort at transparency or merely a strategic move to deflect climate impact concerns [11]
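For readers who want to check the arithmetic behind Group 2, a minimal back-of-envelope sketch in Python follows. It uses only figures quoted in the summary (0.34 Wh per query, an assumed 1 billion queries per day, and the hypothesized fleet of about 3,200 servers); the derived per-server values (~3.6 queries per second and ~4.4 kW) are illustrative outputs, not numbers from the source, and the gap to the model's stated 4.5 queries per second would be consistent with leaving headroom for peak load.

```python
# Back-of-envelope check of the figures discussed in Group 2.
# Inputs are taken from the article: 0.34 Wh per query, an assumed
# ~1 billion queries per day, and a hypothesized ~3,200 GPU servers.

WH_PER_QUERY = 0.34       # watt-hours per query (Altman's figure)
QUERIES_PER_DAY = 1e9     # assumed daily query volume
SERVER_COUNT = 3_200      # server count from the theoretical model

# Total daily energy: 1e9 * 0.34 Wh = 3.4e8 Wh = 340 MWh
daily_energy_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6
print(f"Daily energy: {daily_energy_mwh:.0f} MWh")            # ~340 MWh

# Average power draw implied by that energy over 24 hours
avg_power_mw = daily_energy_mwh / 24
print(f"Average power: {avg_power_mw:.1f} MW")                # ~14.2 MW

# If ~3,200 servers carried the whole load, each would see:
queries_per_second = QUERIES_PER_DAY / 86_400
per_server_qps = queries_per_second / SERVER_COUNT
per_server_kw = avg_power_mw * 1_000 / SERVER_COUNT
print(f"Load per server: {per_server_qps:.1f} queries/s")     # ~3.6
print(f"Power per server: {per_server_kw:.1f} kW")            # ~4.4 kW
```

Under these assumptions the 340 MWh/day figure follows directly from Altman's per-query number, and an implied draw of a few kilowatts per server is at least in the range of a GPU server, which is why the theoretical model is debated rather than dismissed outright.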
Altman claims a single ChatGPT query consumes only 0.34 watt-hours of electricity. Is this figure credible?
36Kr·2025-06-17 12:27