The rise of AI reasoning models comes with a big energy tradeoff
Fortune·2025-12-05 21:56

Core Insights
- Leading AI developers are increasingly focused on creating models that mimic human reasoning, but these models are significantly more energy-intensive, raising concerns about their impact on power grids [1][4].

Energy Consumption
- AI reasoning models consume, on average, 30 times more power to answer 1,000 prompts than comparable models without reasoning capabilities [2].
- A study of 40 open AI models found large disparities in energy consumption: DeepSeek's R1 model used 50 watt-hours with reasoning off and 7,626 watt-hours with reasoning on [3][6].
- Microsoft's Phi 4 reasoning model consumed 9,462 watt-hours with reasoning enabled, compared to 18 watt-hours with it disabled [8].

Industry Concerns
- AI's rising energy demand has drawn scrutiny over the strain on power grids and higher energy costs for consumers; wholesale electricity prices near data centers have surged by up to 267% over the past five years [4].
- Tech companies are expanding data centers to support AI, which may complicate their long-term climate objectives [4].

Model Efficiency
- The report stresses the need to understand AI's evolving energy requirements and notes that not every query requires the most energy-intensive reasoning models [7].
- Google reported that the median text prompt to its Gemini AI service used only 0.24 watt-hours, lower than many public estimates [9].

Industry Leadership Perspectives
- Tech leaders, including Microsoft CEO Satya Nadella, have acknowledged the need to address AI's energy consumption while emphasizing the importance of using AI for societal benefit and economic growth [10].
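The reported figures can be put in perspective with some back-of-envelope arithmetic. The sketch below (a hypothetical calculation, not part of the study) computes the reasoning-on vs. reasoning-off energy ratio for the two models cited; the per-prompt figure assumes the study's watt-hour numbers are measured over 1,000 prompts, matching the basis of the 30x average claim.

```python
# Back-of-envelope comparison using the watt-hour figures reported in the article.
# Assumption: each figure covers a batch of 1,000 prompts, the same basis as the
# article's "30 times more power per 1,000 prompts" average.
MODELS = {
    "DeepSeek R1": {"off_wh": 50, "on_wh": 7626},
    "Microsoft Phi 4": {"off_wh": 18, "on_wh": 9462},
}

def reasoning_overhead(off_wh: float, on_wh: float) -> float:
    """Ratio of energy used with reasoning enabled vs. disabled."""
    return on_wh / off_wh

for name, wh in MODELS.items():
    ratio = reasoning_overhead(wh["off_wh"], wh["on_wh"])
    per_prompt = wh["on_wh"] / 1000  # Wh per single prompt, under the assumption above
    print(f"{name}: {ratio:.0f}x more energy with reasoning on "
          f"({per_prompt:.2f} Wh per reasoning prompt)")
```

Both models far exceed the 30x average: roughly 153x for R1 and 526x for Phi 4, which illustrates why the study's authors distinguish between the average overhead and the worst cases.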
