The Rise of AI Reasoning Models Comes With a Big Energy Tradeoff
Insurance Journal · 2025-12-05 06:05
Core Insights
- Leading AI developers are focusing on creating models that mimic human reasoning, but these models are significantly more energy-intensive, raising concerns about their impact on power grids [1][4]

Energy Consumption
- AI reasoning models consume, on average, 100 times more power to respond to 1,000 prompts than alternatives without reasoning capabilities [2]
- A study of 40 AI models found wide disparities in energy consumption; DeepSeek's R1 model, for instance, used 50 watt-hours with reasoning off and 308,186 watt-hours with reasoning on [3]
- Microsoft's Phi 4 reasoning model consumed 9,462 watt-hours with reasoning enabled, versus 18 watt-hours with it disabled [8]

Industry Concerns
- AI's growing energy demands have drawn scrutiny over the strain on power grids and rising energy costs for consumers; wholesale electricity prices near data centers have surged by up to 267% over the past five years [4]
- Tech companies are expanding data centers to support AI, which may complicate their long-term climate objectives [4]

Model Efficiency
- The report emphasizes the need to understand AI's evolving energy requirements and the importance of selecting appropriate models for specific tasks [7]
- Google reported that a median text prompt to its Gemini AI service used only 0.24 watt-hours, significantly lower than many public estimates [9]

Industry Response
- Tech leaders, including Microsoft CEO Satya Nadella, have acknowledged the need to address AI's energy consumption and suggested that the industry must demonstrate AI's positive societal impact to gain social acceptance for its energy use [10]
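To make the scale of the reasoning-mode tradeoff concrete, the per-model figures above can be turned into on/off energy ratios. This is a minimal sketch using only the watt-hour numbers reported in the article; it assumes the on and off figures for each model were measured over the same workload, which the article does not explicitly confirm.

```python
# Energy figures cited in the article, in watt-hours.
# Assumption: each model's "on" and "off" figures cover the same prompt set.
figures = {
    "DeepSeek R1":     {"reasoning_off": 50, "reasoning_on": 308_186},
    "Microsoft Phi 4": {"reasoning_off": 18, "reasoning_on": 9_462},
}

for model, wh in figures.items():
    # Ratio of energy used with reasoning enabled vs. disabled.
    ratio = wh["reasoning_on"] / wh["reasoning_off"]
    print(f"{model}: about {ratio:,.0f}x more energy with reasoning on")
```

By these figures, R1's reasoning mode uses on the order of thousands of times more energy, and Phi 4's on the order of hundreds, which illustrates why the article's headline average of 100x can understate the gap for individual models.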