HBF Is Set to Take Off: The AI Wave's Next Winner Emerges as Stacked Flash Becomes a New Trend
36Kr · 2025-09-23 11:37
Core Insights
- The demand for High Bandwidth Memory (HBM) has surged due to the AI boom, making it a critical component for AI chips such as NVIDIA's A100 and H200 [2][3]
- Samsung has recently passed NVIDIA's certification for its 12-layer HBM3E, positioning itself as a key supplier for NVIDIA GPUs [1]
- SK Hynix has overtaken Samsung to become the world's largest memory chip manufacturer, driven by the high demand for HBM [3]

Industry Trends
- HBM is becoming a "hard currency" in the semiconductor industry due to its limited supply and high demand [3]
- The capacity limitations and high cost of HBM are becoming bottlenecks for AI model development, prompting the exploration of alternative memory solutions [4][5]
- High Bandwidth Flash (HBF) is emerging as a potential solution to HBM's capacity constraints, allowing larger AI models to be accommodated [6][11]

Technological Developments
- HBF aims to combine the speed of HBM with the capacity of NAND flash memory, serving as a complementary technology rather than a direct replacement [6][8]
- The collaboration between SK Hynix and SanDisk to develop HBF technology is a significant step toward standardizing this new memory type [8][12]
- HBF is designed for the specific needs of AI inference, prioritizing high read speeds and low write frequencies, which aligns well with how AI model weights are accessed [9][11]

Future Outlook
- The first HBF samples are expected in the second half of 2026, with commercial products anticipated in early 2027 [12][15]
- HBF could benefit both data centers and consumer devices by alleviating memory bottlenecks and enabling the use of larger AI models [13][15]
- The successful integration of HBF into AI hardware could significantly enhance AI capabilities, making it a critical development for the future of AI technology [16][17]
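The HBM capacity bottleneck described above can be made concrete with a rough back-of-the-envelope calculation. The sketch below is illustrative only: the model size, the 2-bytes-per-parameter (FP16) assumption, and the 141 GB per-GPU HBM figure (the widely cited capacity of an H200-class accelerator) are assumptions for the example, not figures from the article.

```python
import math

# Rough sketch of why HBM capacity limits large AI models.
# All figures are illustrative assumptions, not from the article.

def model_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to hold the weights (FP16 = 2 bytes/param)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

def gpus_needed(params_billions: float, hbm_per_gpu_gb: float) -> int:
    """Minimum GPUs required to fit the weights in HBM alone
    (ignores KV cache, activations, and framework overhead)."""
    return math.ceil(model_memory_gb(params_billions) / hbm_per_gpu_gb)

# Example: a hypothetical 405B-parameter model on GPUs with 141 GB of HBM each
print(model_memory_gb(405))   # 810.0 GB of FP16 weights
print(gpus_needed(405, 141))  # 6 GPUs just to hold the weights
```

A flash-backed tier with several times the capacity per stack would shrink that GPU count for weight storage, which is the gap HBF is meant to fill: weights are written once and read constantly, so read-optimized flash is a plausible fit.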