They Abandoned HBM
36Kr · 2025-11-03 00:47

Group 1: AI and Storage Market Dynamics
- The storage market is experiencing an unprecedented "super boom cycle" driven by surging computing-power demand from AI model training and inference, with HBM becoming a key component of AI servers [1]
- Major storage companies such as Samsung, SK Hynix, and Micron are posting explosive profit growth: Samsung's Q3 net profit rose 21%, SK Hynix achieved its highest quarterly profit ever, and Micron's net profit tripled year-on-year [1]
- Demand for traditional DRAM and NAND chips is also rising, as data-center giants such as Amazon, Google, and Meta ramp up purchases to expand AI inference and cloud-service capacity [1]

Group 2: Qualcomm's AI Accelerators
- Qualcomm will release its AI200 and AI250 data-center accelerators in 2026 and 2027, designed to compete with AMD's and NVIDIA's solutions for large-scale generative AI workloads [2]
- The AI200 system will feature 768 GB of LPDDR memory, use PCIe for vertical (scale-up) scaling and Ethernet for horizontal (scale-out) scaling, and draw up to 160 kW per rack [4]
- Qualcomm's choice of LPDDR memory instead of expensive HBM signals a potential shift in AI storage technology, emphasizing cost-effectiveness and efficiency [5]

Group 3: Industry Trends and Innovations
- The shift toward LPDDR memory by major chipmakers such as NVIDIA and Intel reflects a broader industry adjustment, with predictions that inference workloads will outnumber training workloads 100-fold by 2030 [8]
- LPDDR memory offers a cost advantage over HBM, with Qualcomm claiming a 13-fold improvement in cost-effectiveness, allowing large-language-model inference workloads to run directly in memory [10]
- The introduction of LPDDR6, with data rates of 10,667 to 14,400 MT/s, marks a significant evolution in low-power memory technology and is expected to see broad adoption in the near future [14][16]

Group 4: Supply Chain Implications
- Rising demand for LPDDR memory in data centers may trigger a supply crisis in the consumer-electronics market, as data-center orders could crowd out smartphone manufacturers' needs [11]
- Higher memory costs and longer lead times could force smartphone manufacturers to compromise on memory configurations or raise prices for mid-to-high-end devices [12]
- The transition from HBM to LPDDR in AI applications signals a shift toward more cost-sensitive commercial deployments, affecting the pricing and availability of memory for consumer devices [18][20]
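As a rough illustration of what the LPDDR6 data rates quoted above mean in practice, the sketch below converts transfer rate into peak per-channel bandwidth (bandwidth = transfers per second × bits per transfer ÷ 8). The 24-bit channel width is an assumption based on the JEDEC LPDDR6 channel definition; actual products and configurations may differ, and real-world throughput will be lower than this theoretical peak.

```python
# Back-of-envelope peak bandwidth implied by the LPDDR6 data rates
# cited above (10,667-14,400 MT/s). The 24-bit channel width is an
# assumption; this is a theoretical peak, not sustained throughput.

def channel_bandwidth_gbps(data_rate_mts: float, channel_bits: int = 24) -> float:
    """Peak bandwidth in GB/s = transfers/s * bits per transfer / 8 bits per byte."""
    return data_rate_mts * 1e6 * channel_bits / 8 / 1e9

for rate in (10_667, 14_400):
    print(f"{rate} MT/s -> {channel_bandwidth_gbps(rate):.1f} GB/s per 24-bit channel")
# 10,667 MT/s works out to about 32 GB/s per channel,
# and 14,400 MT/s to about 43 GB/s per channel.
```

Aggregate bandwidth scales with the number of channels a package exposes, which is why a wide LPDDR configuration (as in the 768 GB AI200 system) can approach the throughput needs of inference workloads at a fraction of HBM's cost.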