Microfluidic Technology
Global AI Power Consumption Is Rapidly Spiraling Out of Control
半导体芯闻 · 2025-09-28 09:47
Core Insights
- By 2030, AI racks are expected to consume 20 to 30 times the energy of traditional racks, with a single AI rack potentially drawing up to 1MW of power [1][4].

Group 1: Energy Consumption and Capacity
- The average power capacity of data center racks is projected to rise to 30-50kW, reflecting growing compute density, particularly for AI workloads [2] (a rough back-of-envelope check of these figures appears after this summary).
- The energy demands of AI racks will impose new requirements on power delivery and cooling infrastructure [3].

Group 2: Cooling Solutions
- Cooling has become a central focus for the industry as compute density and AI workloads increase, with growing interest in liquid cooling methods [3].
- Existing methods such as cold plates have limitations, prompting companies like Microsoft to explore microfluidic technology for more efficient cooling [5].

Group 3: Industry Collaboration and Innovation
- Collaboration among manufacturers, engineers, and end users is increasing in order to address complex cooling challenges [5].
- Smaller operators may find competitive openings as larger operators face delivery bottlenecks, making agility and innovation key advantages [6].
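As a rough sanity check of the figures above, the sketch below compares a 1MW AI rack against the 30-50kW traditional rack range and estimates the liquid-coolant flow needed to carry away that much heat. The coolant properties (water, roughly 4186 J/(kg·K)) and the 10K inlet-to-outlet temperature rise are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope check of the rack power figures cited above.
# Assumptions (not from the article): water-based coolant with a
# specific heat of ~4186 J/(kg*K) and a 10 K temperature rise.

AI_RACK_POWER_W = 1_000_000                   # up to 1 MW per AI rack (article figure)
TRADITIONAL_RACK_POWER_W = (30_000, 50_000)   # 30-50 kW average capacity (article figure)

# Ratio of AI rack power to a traditional rack; should land near the
# 20-30x range the article cites.
ratios = [AI_RACK_POWER_W / p for p in TRADITIONAL_RACK_POWER_W]
print(f"AI rack vs. traditional rack: {ratios[1]:.0f}x to {ratios[0]:.0f}x")

# Rough liquid-cooling flow needed to remove 1 MW of heat:
#   m_dot = Q / (c_p * delta_T)
C_P_WATER = 4186.0   # J/(kg*K), specific heat of water
DELTA_T = 10.0       # K, assumed coolant temperature rise
mass_flow_kg_s = AI_RACK_POWER_W / (C_P_WATER * DELTA_T)
volume_flow_l_min = mass_flow_kg_s * 60       # ~1 kg of water per litre
print(f"Coolant flow for 1 MW at a {DELTA_T:.0f} K rise: "
      f"~{mass_flow_kg_s:.1f} kg/s (~{volume_flow_l_min:.0f} L/min)")
```

Even under these simplified assumptions, a single 1MW rack would require on the order of tens of kilograms of coolant per second, which helps illustrate why the article treats cooling, including approaches beyond cold plates such as microfluidics, as a central constraint.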