AI Memory Market
A critical battle for SK Hynix! Report: final HBM4 samples to be delivered soon; mass production could start this month if NVIDIA certification is passed
Hua Er Jie Jian Wen· 2026-03-10 10:49
Core Insights
- SK Hynix is at a critical turning point in the HBM market, preparing to submit final samples of its sixth-generation high-bandwidth memory (HBM4) to NVIDIA for certification; mass-production orders could follow soon if it passes [1][2]
- The certification matters beyond pass/fail: NVIDIA grades HBM products into a high-end tier (Bin 1) and a lower-performance tier (Bin 2), and the assigned grade will shape SK Hynix's market position [2]

Group 1: HBM4 Certification and Market Position
- SK Hynix's certification had been held up by compatibility issues with NVIDIA's Rubin GPU, which multiple rounds of optimization have now resolved [2][3]
- Samsung gained a competitive edge by starting mass shipments of HBM4 to NVIDIA without a redesign, pressuring SK Hynix, which previously held over 90% of the HBM market [3]
- The certification outcome will determine whether SK Hynix keeps its core-supplier status with NVIDIA or Samsung takes over as the main HBM4 supplier [3]

Group 2: Strategic Initiatives and Developments
- SK Group Chairman Choi Tae-won is personally engaging in high-level diplomacy to support HBM4 sales, planning to meet NVIDIA CEO Jensen Huang at the upcoming GTC 2026 conference [4]
- SK Hynix has developed a 16GB LPDDR6 mobile chip on its 10nm-class 1c process, delivering over 33% higher speed and over 20% better energy efficiency than the previous generation, with mass production planned for later this year [5]
Micron reportedly accelerating HBM4 chip capacity expansion; monthly output to rise to 15,000 wafers
Ge Long Hui APP· 2026-01-07 05:10
Core Viewpoint
- Micron plans to raise its HBM4 monthly production capacity to 15,000 wafers by 2026, nearly 30% of its total HBM capacity of roughly 55,000 wafers per month, signaling a strategic push into the next-generation AI memory market [1]

Group 1: Production Capacity and Strategy
- Micron has historically lagged its Korean competitors in HBM capacity but is now working to close the gap [1]
- CEO Sanjay Mehrotra said on the December 2025 earnings call that significant HBM4 production increases will begin in the second quarter of 2026 [1]
- The company expects the HBM4 yield ramp to proceed faster than that of the previous generation, HBM3E [1]

Group 2: Investment and Development
- Micron has begun equipment investments to accelerate capacity construction [1]
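The capacity share quoted above can be sanity-checked with a quick back-of-the-envelope calculation; both figures come from the article, and "nearly 30%" works out to about 27%:

```python
# Quick check of the HBM4 capacity share implied by the article's figures.
hbm4_capacity = 15_000        # planned HBM4 wafers per month (article)
total_hbm_capacity = 55_000   # approximate total HBM wafers per month (article)

share = hbm4_capacity / total_hbm_capacity
print(f"HBM4 share of total HBM capacity: {share:.1%}")  # → 27.3%
```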
Samsung Memory: one piece of bad news, one piece of good news
半导体芯闻· 2025-06-13 09:41
Group 1
- Samsung Electronics is struggling with its mass-production strategy for next-generation V10 NAND, with full-scale investment expected to slip to the first half of next year [1][2]
- V10 NAND stacks 430 layers, up from 290 layers in the current V9 generation [1]
- Uncertain demand for high-stacking NAND and the introduction of new technologies are hindering Samsung's development [1][2]

Group 2
- Samsung is working with major front-end equipment makers such as Lam Research and TEL to evaluate low-temperature etching equipment for V10 NAND [2]
- Evaluation results suggest low-temperature etching may not be immediately ready for mass production, prompting a reassessment of the equipment [2]
- The investment cost of the new equipment is a significant factor in Samsung's decision to postpone V10 NAND mass production [2]

Group 3
- Samsung has secured a supply agreement with AMD for fifth-generation 12-layer HBM3E memory, to be used in the upcoming MI350 AI accelerator [3][4]
- The 12-layer HBM3E offers over 50% improvements in performance and capacity versus the previous 8-layer version, supporting bandwidth of up to 1,280GB/s [4]
- AMD's upcoming MI400 series is expected to use Samsung's HBM4, seen as a critical battleground for dominance in the AI memory market [5]

Group 4
- HBM4 is expected to give Samsung a significant advantage, as competitors rely on fifth-generation 10nm-class technology while Samsung plans to adopt a more advanced sixth-generation process [5]
- The Helios server architecture, comprising 72 MI400 GPUs, will carry a total of 31TB of HBM4, significantly enhancing AI processing capabilities [5]
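The two headline numbers above can be cross-checked with a short sketch. It assumes the standard 1024-bit HBM stack interface, which the article does not state, to see what per-pin data rate the quoted 1,280GB/s implies, and it treats 31TB as decimal terabytes when estimating per-GPU capacity; both assumptions are flagged in the comments.

```python
# Back-of-the-envelope checks of the article's HBM figures.
# Assumption (not stated in the article): an HBM3E stack uses a 1024-bit interface.
INTERFACE_BITS = 1024
PEAK_BANDWIDTH_GB_S = 1280  # quoted 12-layer HBM3E bandwidth (article)

# Per-pin data rate implied by the quoted peak bandwidth.
pin_rate_gbps = PEAK_BANDWIDTH_GB_S * 8 / INTERFACE_BITS
print(f"Implied per-pin data rate: {pin_rate_gbps:.0f} Gbps")  # → 10 Gbps

# Helios: 72 MI400 GPUs sharing 31TB of HBM4 in total (article figures).
# Assumption: decimal terabytes (1 TB = 1000 GB).
per_gpu_gb = 31 * 1000 / 72
print(f"Implied HBM4 capacity per GPU: ~{per_gpu_gb:.0f} GB")  # → ~431 GB
```

The per-pin figure matching a round 10 Gbps suggests the quoted 1,280GB/s is a peak-interface number rather than a sustained-throughput measurement.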