GF Securities: HBF may become the best solution for meeting the memory-capacity requirements of large AI models
Zhi Tong Cai Jing·2025-12-18 08:57

Core Insights
- Growing parameter counts and context lengths in AI models are driving demand for memory capacity that current HBM technology struggles to meet [1][2]
- SanDisk and SK Hynix are collaborating on a new storage product, HBF, which is expected to meet the memory-capacity requirements of AI models and could offer 8-16 times the capacity of existing HBM [2][3]
- Companies with database technology backgrounds, such as Alibaba and Huawei, are well placed to develop data infrastructure software built on HBF storage [3][4]

Group 1: HBF Development and Capabilities
- HBF (High Bandwidth Flash) is being developed to address the memory-capacity needs of AI models, with per-GPU capacity planned to reach up to 4TB [2]
- HBF aims to achieve high-speed interconnection with GPUs using BiCS 3D NAND and CBA wafer-bonding processes [2][3]

Group 2: Potential of Companies in Data Infrastructure Software
- Companies such as Huawei and Alibaba, along with independent firms such as Transwarp Technology and PingCAP, are positioned to build data infrastructure software optimized for HBF storage [3][4]
- Such software is most valuable to companies with heavy data-processing needs, particularly for their AI model inference workloads [3]

Group 3: Impact of HBF Technology Maturity
- As HBF technology matures, it is expected to drive adoption of related data infrastructure software in AI inference workloads [4]
- Transwarp Technology's ArgoDB, a distributed database optimized for flash storage, points to further development potential in this area [4]
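The capacity gap between HBM and what long-context inference demands is the crux of the report's argument. The sketch below is a rough, purely illustrative back-of-the-envelope estimate: it assumes a hypothetical ~405B-parameter FP16 model, a 1M-token context, and a representative 192 GB of HBM per GPU; of these figures, only the ~4 TB per-GPU HBF target comes from the report, and the rest are assumptions for illustration only.

```python
# Back-of-the-envelope memory estimate for large-model inference.
# All model and hardware figures below are illustrative assumptions,
# except the ~4 TB per-GPU HBF target cited in the report.

def weight_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Memory for model weights (FP16/BF16 = 2 bytes per parameter)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

def kv_cache_memory_gb(layers: int, kv_heads: int, head_dim: int,
                       context_len: int, batch: int = 1,
                       bytes_per_value: int = 2) -> float:
    """KV-cache size: 2 (K and V) * layers * kv_heads * head_dim * tokens * batch."""
    return (2 * layers * kv_heads * head_dim * context_len * batch
            * bytes_per_value / 1e9)

# Hypothetical dense model roughly in the frontier class.
weights = weight_memory_gb(params_billion=405)               # ~810 GB in FP16
kv = kv_cache_memory_gb(layers=126, kv_heads=8, head_dim=128,
                        context_len=1_000_000)                # ~516 GB at 1M tokens

total = weights + kv
hbm_per_gpu = 192    # GB, assumed high-end HBM capacity per GPU today
hbf_per_gpu = 4096   # GB, the ~4 TB per-GPU figure cited for HBF

print(f"weights ~ {weights:.0f} GB, KV cache ~ {kv:.0f} GB, total ~ {total:.0f} GB")
print(f"GPUs needed at {hbm_per_gpu} GB HBM each: {total / hbm_per_gpu:.1f}")
print(f"GPUs needed at {hbf_per_gpu} GB HBF each: {total / hbf_per_gpu:.1f}")
```

Under these assumptions, weights plus KV cache exceed 1.3 TB, several times a single GPU's HBM but well within a 4 TB HBF budget, which is the capacity argument the report makes for HBF in long-context inference.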