Core Viewpoint
- Micron Technology expresses confidence that it will sell out its high-bandwidth memory (HBM) chips next year; these chips are crucial for artificial intelligence (AI) applications [2][3].

Group 1: HBM Market Dynamics
- Micron's Chief Business Officer, Sumit Sadana, announced significant progress in discussions over HBM supply for 2026, signaling confidence that all of next year's HBM output will be sold [2].
- Micron's supply next year will focus primarily on 12-layer HBM3E and potentially HBM4, the sixth generation of HBM [3].
- Micron and SK Hynix together hold roughly 90% of the market for 12-layer HBM3E, the leading HBM product for AI chips [3].

Group 2: Competitive Landscape
- Micron differentiates itself by highlighting its relationship with Nvidia, noting that it has already begun mass production of HBM3E [3].
- SK Hynix and Samsung are also in the race, with plans to launch HBM4 in the second half of this year, while Micron is targeting next year [3][4].
- Micron's HBM4 will use the same 1β node as its HBM3E, a process regarded as mature and high-performing, in contrast to competitors moving to newer nodes [4].

Group 3: Pricing and Production Challenges
- HBM4 is expected to double the I/O count over the previous generation, supporting a projected price increase of about 30%, to approximately $500 per unit [5].
- Negotiations between SK Hynix and Nvidia over HBM supply for 2026 have faced delays, raising concerns about finalizing contracts [5].
- Micron's HBM4 is positioned to leverage the established 1β process, while Samsung's approach may require additional validation because it uses the newer 1c node [5].
Micron's HBM4: Biding Its Time to Overtake