Storage "Black Tech" Is Here
财联社·2026-02-12 06:24

Core Viewpoint
- The article discusses the acceleration of global storage technology innovation driven by the growing memory demands of AI computing, highlighting SK Hynix's new storage architecture, H³, which combines HBM (High Bandwidth Memory) and HBF (High Bandwidth Flash) to significantly boost performance in AI applications [3][6].

Group 1: H³ Architecture
- SK Hynix introduced the H³ architecture at an IEEE global semiconductor conference; it integrates HBM and HBF technologies to improve performance [3].
- In simulations paired with NVIDIA's Blackwell GPU, the H³ architecture delivered up to 2.69 times higher performance per watt than an HBM-only configuration [3][5].

Group 2: Advantages in AI Inference
- The H³ architecture is particularly advantageous for AI inference, where the key-value (KV) cache holds the intermediate attention data produced as an AI service converses with users, so that earlier tokens do not have to be recomputed [5].
- By offloading KV cache storage to HBF, the burden on the GPU and HBM is reduced, allowing them to focus on high-speed computation and data generation [6] (a minimal sketch of this offloading pattern appears below).

Group 3: Industry Development
- SK Hynix, Samsung, and SanDisk are all advancing HBF technology, with SK Hynix planning to release HBF1 samples by the end of this year, featuring 16-layer NAND flash stacking [6].
- Samsung and SanDisk aim to see HBF adopted in actual products from NVIDIA, AMD, and Google by late 2027 or early 2028 [6].

Group 4: Market Potential
- Current AI models require memory capacities beyond what HBM alone can supply; HBF is expected to provide 8 to 16 times the capacity of existing HBM, potentially expanding the memory attached to a single GPU to around 4TB [7] (a back-of-the-envelope check follows the sketch below).
- Niche storage markets are expected to grow on AI demand, with SLC NAND likely to be used in AI SSD products and in HBF, supporting sustained price increases in the niche storage sector [7].
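The article does not describe SK Hynix's software interface, so the following is a minimal, hypothetical Python sketch of the general pattern the Group 2 items describe: keeping hot KV-cache blocks in a small fast tier (standing in for HBM) while spilling colder blocks to a larger, slower tier (standing in for HBF). The class and method names (TieredKVCache, put, get) are illustrative assumptions, not part of any real product API.

```python
from collections import OrderedDict

class TieredKVCache:
    """Toy two-tier KV cache: a small 'fast' tier (standing in for HBM)
    and a large 'slow' tier (standing in for HBF). Blocks evicted from
    the fast tier spill into the slow tier instead of being discarded."""

    def __init__(self, fast_capacity: int):
        self.fast_capacity = fast_capacity
        self.fast = OrderedDict()   # hot KV blocks, limited capacity
        self.slow = {}              # cold KV blocks, effectively unbounded here

    def put(self, token_id: int, kv_block: bytes) -> None:
        # New blocks land in the fast tier; the oldest blocks spill to the slow tier.
        self.fast[token_id] = kv_block
        self.fast.move_to_end(token_id)
        while len(self.fast) > self.fast_capacity:
            old_id, old_block = self.fast.popitem(last=False)
            self.slow[old_id] = old_block

    def get(self, token_id: int) -> bytes:
        # Fast-tier hits are cheap; slow-tier hits are promoted back to the fast tier.
        if token_id in self.fast:
            self.fast.move_to_end(token_id)
            return self.fast[token_id]
        block = self.slow.pop(token_id)
        self.put(token_id, block)
        return block


if __name__ == "__main__":
    cache = TieredKVCache(fast_capacity=2)
    for t in range(4):
        cache.put(t, f"kv-for-token-{t}".encode())
    print(sorted(cache.slow))      # [0, 1]  (older tokens spilled to the slow tier)
    print(cache.get(0).decode())   # kv-for-token-0 (promoted back on access)
    print(sorted(cache.slow))      # [1, 2]  (token 2 spilled to make room)
```

The point of the sketch is only the division of labor: the fast tier stays small and busy, while the bulk of the conversation history sits in the larger, cheaper tier until it is needed.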
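As a rough back-of-the-envelope check of the Group 4 capacity claim, assume a current high-end GPU carries on the order of 256 GB of HBM (an assumed baseline for illustration, not a figure from the article); an 8x to 16x HBF multiple then lands at roughly 2 TB to 4 TB, consistent with the ~4TB figure cited.

```python
# Back-of-the-envelope check of the claim that HBF at 8-16x HBM capacity
# could push per-GPU memory toward ~4TB.
# The 256 GB HBM baseline is an assumption for illustration, not from the article.
hbm_capacity_gb = 256
for multiplier in (8, 16):
    hbf_capacity_tb = hbm_capacity_gb * multiplier / 1024
    print(f"{multiplier}x HBF on {hbm_capacity_gb} GB HBM -> ~{hbf_capacity_tb:.1f} TB")
# Output:
#  8x HBF on 256 GB HBM -> ~2.0 TB
# 16x HBF on 256 GB HBM -> ~4.0 TB
```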
