Core Insights
- Nvidia's next-generation AI accelerator, Vera Rubin, is shaping the high-bandwidth memory (HBM) supply landscape, with Samsung Electronics and SK Hynix selected as suppliers [1]
- Competition for HBM4 supply is intensifying, with Nvidia setting performance requirements that exceed industry standards [2]
- Micron is positioning itself differently in the HBM market, targeting mid-range AI accelerators rather than flagship models [3]

Group 1: Supply Chain Developments
- Samsung and SK Hynix are expected to begin mass production of HBM4 as early as this month, coinciding with Nvidia's upcoming GTC conference [1]
- SK Hynix is projected to handle over half of Nvidia's total HBM supply, including HBM3E, by 2026, while Samsung is expected to dominate HBM4 supply for Vera Rubin [1]
- Samsung began HBM4 shipments in February, while SK Hynix has yet to announce its delivery schedule, raising market concerns [3]

Group 2: Technical Specifications
- Nvidia has set a data-rate requirement above 10 Gb/s for the HBM4 used in Vera Rubin, surpassing the JEDEC standard of 8 Gb/s [2]
- Samsung has passed two levels of qualification testing for HBM4, at data rates of 10 Gb/s and 11 Gb/s, while SK Hynix is still optimizing its products for the higher level [2]
- Vera Rubin is expected to feature 16 HBM4 stacks for a total capacity of 576 GB, exceeding the 432 GB of AMD's upcoming MI450 [2]

Group 3: Market Dynamics
- The sharp rise in commodity DRAM prices is reshaping HBM suppliers' strategic calculations, giving Samsung additional negotiating leverage [4]
- The profitability gap between HBM and commodity DRAM is narrowing, prompting manufacturers to adjust their capacity allocations [4]
- Samsung's ability to produce both HBM4 and commodity DRAM lets it offer Nvidia diversified pricing options, strengthening its negotiating position [5]
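A quick arithmetic sanity check of the figures reported above; the per-stack capacity and percentage gap are derived here, not stated in the source:

```python
# Reported: Vera Rubin pairs 16 HBM4 stacks for 576 GB total,
# versus a 432 GB limit on AMD's upcoming MI450.
total_gb = 576
stacks = 16
per_stack_gb = total_gb / stacks  # implied capacity per HBM4 stack
print(per_stack_gb)               # 36.0 GB per stack

mi450_gb = 432
advantage_gb = total_gb - mi450_gb
advantage_pct = round(100 * advantage_gb / mi450_gb)
print(advantage_gb, advantage_pct)  # 144 GB more, roughly 33% above MI450

# Reported data rates: Nvidia requires over 10 Gb/s vs. the 8 Gb/s JEDEC baseline
required_gbps, jedec_gbps = 10, 8
print(round(100 * (required_gbps - jedec_gbps) / jedec_gbps))  # at least 25% above the standard
```

The implied 36 GB per stack and the 25%+ over-spec data rate are consistent with the article's framing that Nvidia's requirements exceed the current JEDEC baseline.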
Samsung and SK Hynix have been selected as Nvidia's Rubin HBM4 suppliers, with shipments expected to begin in March.