High Bandwidth Memory (HBM4)
The Big Winner in HBM
半导体芯闻· 2025-03-20 10:26
Core Viewpoint
- SK Hynix has launched HBM4, the sixth generation of high bandwidth memory, which will be used in Nvidia's next-generation AI accelerators and marks a significant advance in memory technology [1][2].

Group 1: HBM4 Development and Features
- SK Hynix announced that HBM4 delivers over 2 TB/s of bandwidth, enough to process more than 400 full-HD movies in one second (a quick arithmetic check appears at the end of this summary) [1].
- HBM4 is reported to be more than 60% faster than its predecessor, HBM3E, with improved stability from better heat dissipation and control of chip warpage [1][2].
- The company plans to begin mass production of 12-layer HBM4 in the second half of 2025 and 16-layer HBM4 in 2026 [1].

Group 2: Market Position and Competition
- SK Hynix holds about 65% of the global HBM market, followed by Samsung at 32% and Micron at 3%, and remains the primary supplier for Nvidia's latest AI chips [2].
- Competition among SK Hynix, Samsung, and Micron is intensifying as each accelerates HBM development to meet the growing demand from AI applications [2].

Group 3: Technological Advancements
- SK Hynix's sixth-generation 10nm-class (1c) DRAM process, first developed for DDR5, is expected to enhance HBM performance, with a focus on reducing power consumption and improving memory efficiency [3][4].
- SK Hynix aims to leverage these DRAM advances to increase HBM capacity while keeping chip size unchanged, which also benefits thermal management [4].
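The headline figures can be cross-checked with simple arithmetic. The sketch below is a rough sanity check, assuming roughly 5 GB per full-HD movie and roughly 1.2 TB/s peak bandwidth per HBM3E stack; neither of those two numbers appears in the article.

```python
# Rough sanity check of the quoted HBM4 figures (a sketch, not official data).
# Assumptions not taken from the article: ~5 GB per full-HD movie and
# ~1.2 TB/s peak bandwidth per HBM3E stack.

FULL_HD_MOVIE_GB = 5      # assumed size of one full-HD movie
MOVIES_PER_SECOND = 400   # figure quoted by SK Hynix
HBM3E_PEAK_TBPS = 1.2     # assumed HBM3E per-stack peak bandwidth

# Convert "400 full-HD movies per second" into a bandwidth figure (GB/s -> TB/s).
hbm4_tbps = FULL_HD_MOVIE_GB * MOVIES_PER_SECOND / 1000

# Compare against the assumed HBM3E baseline.
speedup_pct = (hbm4_tbps / HBM3E_PEAK_TBPS - 1) * 100

print(f"Implied HBM4 bandwidth: {hbm4_tbps:.1f} TB/s")      # ~2.0 TB/s
print(f"Implied gain over HBM3E: {speedup_pct:.0f}%")       # ~67%
```

Under these assumptions the movie figure works out to about 2 TB/s, and the gain over HBM3E comes to roughly 67%, consistent with the article's "over 2 TB/s" and "over 60% faster" claims.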