DeepSeek Paper Reveals New Model Mechanism; SSD and Other Storage Demand Poised to Rise Further; Industry Leader Also Posts Blockbuster Results
Xuan Gu Bao · 2026-01-13 23:24

Group 1
- DeepSeek released a new paper proposing "conditional memory" as a new dimension of sparsity, implemented through the Engram module, for optimizing large language models [1]
- The existing Transformer architecture lacks a native knowledge-retrieval mechanism, so it must simulate retrieval behavior inefficiently [1]
- Conditional memory complements the MoE (Mixture of Experts) approach and, at equal parameter counts and compute budgets, significantly improves model performance on knowledge retrieval, reasoning, coding, and mathematical tasks [1]

Group 2
- The Engram module is a large, scalable embedding table that acts as external memory for the Transformer, allowing the model to retrieve relevant content efficiently [2]
- Engram caches frequently accessed embeddings in faster storage media while placing less frequently accessed entries in larger, slower storage, keeping access latency low (a minimal sketch of this tiered scheme follows the summary below) [2]
- NAND industry capital expenditure is expected to remain limited over the next two years, with leading manufacturers likely to prioritize HBM over NAND, while AI applications are expected to drive SSD demand [2]

Group 3
- Baiwei Storage forecasts a full-year net profit of 850 million to 1 billion yuan, a year-on-year increase of 427.19% to 520.22% [2]
- Jiangbolong has launched several high-speed enterprise-grade eSSD products covering mainstream capacities from 480GB to 7.68TB [3]
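To make the tiered-storage idea concrete, below is a minimal Python sketch of an embedding table split across a fast "hot" tier and a larger, slower "cold" tier, with frequently accessed rows promoted into an LRU cache. All names here (`TieredEmbeddingMemory`, `lookup`, the capacity numbers) are hypothetical illustrations of the general caching pattern described in the summary, not DeepSeek's actual Engram implementation.

```python
from collections import OrderedDict
import numpy as np

class TieredEmbeddingMemory:
    """Illustrative tiered embedding table: hot rows live in a small
    in-memory LRU cache (the fast tier), while the full table sits in
    a larger, slower store (standing in for a DRAM-vs-SSD split)."""

    def __init__(self, vocab_size: int, dim: int, hot_capacity: int):
        # Slow tier: the full embedding table (in practice this could
        # be memory-mapped from SSD rather than held in RAM).
        self.slow_store = np.random.randn(vocab_size, dim).astype(np.float32)
        # Fast tier: small LRU cache of frequently accessed rows.
        self.hot_cache: OrderedDict[int, np.ndarray] = OrderedDict()
        self.hot_capacity = hot_capacity

    def lookup(self, token_id: int) -> np.ndarray:
        # Hot-tier hit: refresh recency and serve from the cache.
        if token_id in self.hot_cache:
            self.hot_cache.move_to_end(token_id)
            return self.hot_cache[token_id]
        # Miss: fetch from the slow tier and promote to the hot tier.
        vec = self.slow_store[token_id]
        self.hot_cache[token_id] = vec
        if len(self.hot_cache) > self.hot_capacity:
            self.hot_cache.popitem(last=False)  # evict least-recently-used
        return vec

memory = TieredEmbeddingMemory(vocab_size=100_000, dim=64, hot_capacity=1_024)
v1 = memory.lookup(token_id=42)  # cold read promotes row 42 to the hot tier
v2 = memory.lookup(token_id=42)  # repeat read is served from the fast cache
```

If a scheme like this keeps the hot working set small while the full table scales with model knowledge, the bulk of the embeddings can live on high-capacity media such as SSDs, which is the link the article draws between the Engram design and storage demand.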