AI Memory Market
Micron reportedly accelerating HBM4 chip expansion; monthly capacity to rise to 15,000 wafers
Ge Long Hui APP · 2026-01-07 05:10
Core Viewpoint
- Micron plans to increase its HBM4 monthly production capacity to 15,000 wafers by 2026, representing nearly 30% of its total HBM capacity of approximately 55,000 wafers per month, indicating a strategic focus on the next-generation AI memory market [1]

Group 1: Production Capacity and Strategy
- Micron has historically lagged behind its Korean competitors in HBM capacity but is now working to change this [1]
- CEO Sanjay Mehrotra announced during the December 2025 earnings call that significant HBM4 production increases will begin in the second quarter of 2026 [1]
- The company expects the yield ramp for HBM4 to be faster than that of the previous generation, HBM3E [1]

Group 2: Investment and Development
- Micron has initiated equipment investments to accelerate capacity construction [1]
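The "nearly 30%" share quoted above follows directly from the two wafer counts in the article; a quick sketch of the arithmetic (only the wafer figures are from the source, the percentage is derived):

```python
# Wafer figures as reported: planned HBM4 output vs. approximate total HBM capacity.
hbm4_wafers_per_month = 15_000
total_hbm_wafers_per_month = 55_000

# Share of total HBM capacity devoted to HBM4.
share = hbm4_wafers_per_month / total_hbm_wafers_per_month
print(f"HBM4 share of total HBM capacity: {share:.1%}")  # 27.3%, i.e. "nearly 30%"
```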
Samsung Memory: one piece of bad news, one piece of good news
半导体芯闻 · 2025-06-13 09:41
Group 1
- Samsung Electronics is struggling with the mass production strategy for the next-generation NAND V10, with full-scale investment expected to be delayed until the first half of next year [1][2]
- The V10 NAND features a stacking layer count of 430 layers, surpassing the current V9 generation, which has 290 layers [1]
- Uncertainty in high-stacking NAND demand and the introduction of new technologies are hindering Samsung's development [1][2]

Group 2
- Samsung is collaborating with major front-end equipment manufacturers such as Lam Research and TEL to evaluate low-temperature etching equipment for the V10 NAND [2]
- The assessment results indicate that low-temperature etching may not be immediately applicable for mass production, leading to a reevaluation of the equipment [2]
- The investment costs associated with new equipment are a significant factor in Samsung's decision to postpone V10 NAND mass production [2]

Group 3
- Samsung has secured a supply agreement with AMD for fifth-generation 12-layer HBM3E memory, which will be used in the upcoming MI350 AI accelerator [3][4]
- The 12-layer HBM3E offers over 50% improvement in performance and capacity compared to the previous 8-layer version, supporting bandwidth of up to 1,280GB/s [4]
- AMD's upcoming MI400 series is expected to use Samsung's HBM4, which is seen as a critical battleground for dominance in the AI memory market [5]

Group 4
- HBM4 is anticipated to provide significant advantages for Samsung, as competitors are using fifth-generation 10nm-class technology while Samsung plans to adopt a more advanced sixth-generation process [5]
- The Helios server architecture, which includes 72 MI400 GPUs, will carry a total of 31TB of HBM4, significantly enhancing AI processing capabilities [5]
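The Helios figure above implies a per-GPU memory footprint that the article does not state; a minimal sketch deriving it, assuming decimal TB-to-GB conversion and an even split across GPUs (only the 31TB and 72-GPU figures are from the source):

```python
# Helios configuration as reported: 31 TB of HBM4 shared across 72 MI400 GPUs.
total_hbm4_tb = 31
gpu_count = 72

# Derived per-GPU capacity, assuming decimal units (1 TB = 1000 GB).
per_gpu_gb = total_hbm4_tb * 1000 / gpu_count
print(f"HBM4 per GPU: {per_gpu_gb:.0f} GB")  # ~431 GB per MI400
```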