Core Viewpoint
- The AI industry is running into power shortages that constrain the operation of AI models and data centers. Microsoft CEO Satya Nadella noted that the company's data centers are nearing their power and physical-space limits, leaving many AI chips sitting in storage because they cannot be brought online [1].

Power Consumption in the AI Industry
- Model training is a major power consumer, requiring vast amounts of data and high-performance computing. Training OpenAI's GPT-3 model consumed approximately 1.287 GWh of electricity, roughly the annual consumption of 120 American households [4].
- Beyond training, model inference is an ongoing power draw: ChatGPT alone requires about 500,000 kWh per day to serve its more than 200 million users [6] (see the back-of-envelope check after this summary).

Auxiliary Facilities and Their Impact
- Auxiliary facilities such as cooling and power supply systems are essential to keep AI hardware running; they draw electricity on top of the compute load, and the heat the servers give off must in turn be removed by the cooling systems [9].

Investment Opportunities in the Power Sector
- Huatai Securities expects the AI build-out to accelerate construction of the U.S. power system, delaying the coal-power phase-out and boosting solar-plus-storage and solid oxide fuel cells (SOFC). It sees multiple segments of the new-energy sector flourishing during this transition [10].
- CITIC Securities highlights long-term opportunities in ultra-high-voltage, flexible DC transmission, and smart-grid sectors driven by the "14th Five-Year Plan," and expects a structural rebound in demand for transmission and transformation equipment [12].
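As a rough back-of-envelope check of the two consumption figures above (using a commonly cited U.S. average of roughly 10,700 kWh of residential electricity use per household per year, a figure not stated in the source):

$$
\frac{1.287\ \text{GWh}}{120\ \text{households}} \approx 10{,}700\ \text{kWh per household-year},
\qquad
\frac{500{,}000\ \text{kWh/day}}{200\times 10^{6}\ \text{users}} = 2.5\ \text{Wh per user per day}.
$$

On these numbers the training comparison is internally consistent, and the daily inference load works out to only a few watt-hours per user; the strain on the grid comes from the sheer scale of usage rather than from any single query.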
AI industry chain: which segments are the "heavy power consumers"?