Storage as Computation: Huawei Storage Pioneers a New Paradigm for Large Model Training and Inference
NORTHEAST SECURITIES · 2025-08-18 10:12
Investment Rating
- The report rates the industry as "Better than the Trend" [7]

Core Insights
- The report emphasizes the importance of storage in enhancing the training and inference efficiency of large models, highlighting that storage optimization can significantly reduce training time and improve inference performance [3][17]
- The shift towards inference as the core growth driver for computing power is noted, with increasing demand for diverse and long-context tasks [3][30]
- Huawei's "Storage as Computation" approach is presented as a systematic solution that optimizes performance through hardware and software integration [4][51]

Summary by Sections

1. Storage Enhancements for Large Model Training and Inference
- Storage plays a critical role in reducing data loading and checkpoint recovery times, potentially shortening training durations by 30% [18][21]
- Inference performance can be significantly improved: "Storage as Computation" reduces first-token latency by 90% and expands context windows by more than ten times [24][27]

2. Transition to Inference-Centric Models
- The report notes a surge in inference demand, with predictions that by 2027 inference computing power will account for 70% of total demand [30][31]
- The complexity of inference tasks is increasing, necessitating advanced storage solutions to manage longer contexts and higher concurrency [36][37]

3. Huawei's Systematic Approach
- Huawei's AI SSDs are designed to handle both hot and cold data, with innovations in storage technology aimed at enhancing performance and capacity [4][52]
- The UCM unified memory data manager is highlighted as a key component in optimizing inference efficiency [52]

4. Related Investment Opportunities
- The report identifies several companies as potential investment targets, including Huawei storage distributors and suppliers, as well as companies involved in advanced packaging and SSD controller chips [5][6][4]
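The report does not disclose UCM's internals, but the first-token-latency gains attributed to "Storage as Computation" generally come from reusing previously computed KV-cache blocks instead of recomputing the prefill for a repeated prompt prefix. The sketch below illustrates that mechanism with a hypothetical two-tier cache: a small fast tier standing in for HBM/DRAM and a large slow tier standing in for SSD. All class and method names here are illustrative assumptions, not Huawei's actual API.

```python
import hashlib

class TieredKVCache:
    """Minimal sketch of a two-tier KV cache. A prefix hit in either
    tier means the prefill computation for that prefix can be skipped,
    which is the mechanism behind the first-token-latency reduction
    described in the report. Names and policies are hypothetical."""

    def __init__(self, fast_capacity=2):
        self.fast = {}                  # hot KV blocks (HBM/DRAM stand-in), capacity-limited
        self.slow = {}                  # cold KV blocks (SSD stand-in), effectively unbounded
        self.fast_capacity = fast_capacity

    @staticmethod
    def _key(prefix_tokens):
        # Hash the token prefix so equal prefixes map to the same block.
        return hashlib.sha256(" ".join(map(str, prefix_tokens)).encode()).hexdigest()

    def put(self, prefix_tokens, kv_block):
        key = self._key(prefix_tokens)
        if len(self.fast) >= self.fast_capacity:
            # Demote an arbitrary hot entry to the slow tier (a real
            # system would use LRU or reuse-distance heuristics).
            old_key, old_block = self.fast.popitem()
            self.slow[old_key] = old_block
        self.fast[key] = kv_block

    def get(self, prefix_tokens):
        """Return (kv_block, tier) on a hit, (None, None) on a miss."""
        key = self._key(prefix_tokens)
        if key in self.fast:
            return self.fast[key], "fast"
        if key in self.slow:
            block = self.slow.pop(key)
            self.put(prefix_tokens, block)  # promote back to the fast tier
            return block, "slow"
        return None, None
```

A slow-tier hit still avoids the expensive prefill recomputation; it only pays an SSD read, which is how a larger (cheaper) storage tier can expand the effective context window while keeping first-token latency low.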