DeepSeek Releases New Paper Co-Authored by Liang Wenfeng
Securities Times · 2026-01-13 03:27
Core Viewpoint
- DeepSeek released a new paper titled "Conditional Memory via Scalable Lookup: A New Axis of Sparsity for Large Language Models," which introduces conditional memory to enhance model performance across a range of tasks under equal parameter and compute budgets [1].

Group 1
- The paper was co-authored by Peking University and DeepSeek, with Liang Wenfeng listed as a co-author [1].
- Conditional memory is proposed to significantly improve model performance on knowledge retrieval, reasoning, coding, and mathematical tasks [1].
- DeepSeek has open-sourced a related memory module called Engram [1].