CICC: AI Decade Outlook (27): Crossing the Boundary of "Forgetting": The Three-Layer Architecture of Model Memory and Industry Opportunities
CICC · 2026-02-24 14:20
Securities research report, 2026.02.11: AI Decade Outlook (27): Crossing the boundary of "forgetting": the three-layer architecture of model memory and industry opportunities. Analysts: 于钟海 (SAC S0080518070011; SFC CE Ref BOP246), 韩蕊 (SAC S0080523070010; SFC CE Ref BXD683; rui.han@cicc.com.cn), 王之昊 (SAC S0080522050001; SFC CE Ref BSS168; zhihao3.wang@cicc.com.cn). [Chart omitted: relative performance (%), 2025-02 to 2026-01, 沪深300 (CSI 300) vs CICC Software & Services.] Investment view: The evolution of large models is, in essence, a history of fighting "forgetting." While we marvel at models' reasoning abilities, we often overlook a key shortcoming: in an architecture without memory retention, every pass a model makes over historical information is, at bottom, an expensive "repeated computation." This brute-force mode of spending heavy compute to fight forgetting is running into the physical limits of the memory wall and the context window. We believe that in 2026 and beyond, the main battleground of AI Infra will add "model mem ...
CICC | AI Decade Outlook (27): Crossing the Boundary of "Forgetting": The Three-Layer Architecture of Model Memory and Industry Opportunities
中金点睛 (CICC Insights) · 2026-02-12 23:36
CICC Research: The evolution of large models is, in essence, a history of fighting "forgetting." While we marvel at models' reasoning abilities, we often overlook a key shortcoming: in an architecture without memory retention, every pass a model makes over historical information is, at bottom, an expensive "repeated computation." This brute-force mode of fighting forgetting with costly compute is running into the physical limits of the memory wall and the context window. We believe the main battleground of AI Infra in 2026 and beyond will gain a new pole: "model memory."

What is model memory? How should we understand the software and hardware requirements of the three-layer memory system of short-, mid-, and long-term memory? How does this layered memory system map onto training, inference, and Agent scenarios? We answer these questions in this report.

Abstract

Short-term memory forms the large model's "current field of view" for a single inference pass. As "hot data" that is read and written at high frequency and is extremely latency-sensitive, its core tension is the KV Cache's dual squeeze on GPU memory capacity and bandwidth. On the software side, optimizations include PagedAttention-style memory virtualization and prefill/decode (PD) disaggregated scheduling, along with frontier architectures such as Infini-attention to support million-token context windows. This logic directly anchors HBM and on-chip SRAM as the key hardware elements for breaking through the "memory wall" and the "latency wall."

Mid-term memory preserves episodic continuity across sessions and is the foun ...
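The short-term-memory discussion above centers on the KV Cache squeezing GPU memory. A minimal NumPy sketch (a toy single-head attention, not any vendor's serving stack; all shapes and names are illustrative assumptions) shows why that cache grows linearly with context length:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class ToyAttention:
    """Toy single-head attention with a KV cache (illustrative only)."""
    def __init__(self, d=8, seed=0):
        rng = np.random.default_rng(seed)
        self.Wq, self.Wk, self.Wv = (rng.standard_normal((d, d)) / np.sqrt(d)
                                     for _ in range(3))
        self.k_cache, self.v_cache = [], []  # grows by one entry per decoded token

    def step(self, x):
        """Process one new token vector x, reusing cached K/V for earlier tokens."""
        q = x @ self.Wq
        self.k_cache.append(x @ self.Wk)     # without the cache, every step would
        self.v_cache.append(x @ self.Wv)     # recompute K/V for the whole history
        K = np.stack(self.k_cache)           # (t, d): memory is linear in context length
        V = np.stack(self.v_cache)
        attn = softmax(q @ K.T / np.sqrt(len(q)))
        return attn @ V

attn = ToyAttention()
for t in range(5):                           # decode 5 tokens
    out = attn.step(np.full(8, 0.1 * t))
print(len(attn.k_cache))  # 5 -- one cached K/V pair per token seen
```

Keeping `K`/`V` around trades memory for compute: each step costs one matrix-vector product instead of reprocessing the whole prefix, which is exactly the capacity/bandwidth tension the report attributes to short-term memory.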
CICC: AI Decade Outlook: 2026 Key Trends: Model Technology
CICC · 2026-02-11 05:58
Investment Rating
- The report maintains a positive outlook on the AI industry, particularly focusing on advancements in large model technologies and their applications in various productivity scenarios [2][3].

Core Insights
- In 2025, global large model capabilities advanced significantly, overcoming challenges in reasoning, programming, and multimodal abilities, although issues like stability and hallucination rates remain [2][3].
- Looking ahead to 2026, breakthroughs in reinforcement learning, model memory, and context engineering are anticipated, moving from short-context generation to long reasoning-chain tasks and from text interaction to native multimodal capabilities [2][3][4].
- The scaling law for pre-training is expected to continue, with flagship models achieving higher parameter counts and intelligence limits, driven by advancements in NVIDIA's GB-series chips and the adoption of more efficient model architectures [3][4].

Summary by Sections

Model Architecture and Optimization
- The report emphasizes the continuation of the Transformer architecture, with a consensus on the efficiency of the Mixture of Experts (MoE) model, which balances performance and efficiency [40][41].
- Various attention mechanisms are being optimized to enhance computational efficiency, with a focus on hybrid approaches that combine different types of attention for better performance [49][50].

Model Capabilities
- The report highlights significant improvements in reasoning, programming, agentic capabilities, and multimodal tasks, indicating that large models have reached a level of real productivity in various fields [13][31].
- Models' ability to perform complex reasoning tasks has improved, with the introduction of interleaved thinking chains allowing seamless transitions between thought and action [24][28].

Market Dynamics
- Competition among leading global model manufacturers remains intense, with companies like OpenAI, Anthropic, and Gemini pushing the boundaries of model intelligence and exploring AGI [31][32].
- Domestic models are catching up, maintaining a gap of about six months behind their international counterparts, with significant advancements in capabilities [32][33].

Future Outlook
- The report anticipates that the introduction of continuous learning and model memory will address the "catastrophic forgetting" problem, enabling models to adapt dynamically based on task importance [4][5].
- The integration of high-quality data and large-scale computing resources is crucial for enhancing reinforcement learning, which is expected to play a key role in unlocking advanced model functionalities [3][4].
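The MoE point above, activating only a subset of parameters per input, can be sketched in a few lines. Everything here (expert count, gating, `tanh` experts) is an illustrative assumption, not the architecture of any model named in the report:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_forward(x, experts, gate_W, k=2):
    """Top-k MoE routing sketch: score all experts, run only the best k.

    `experts` is a list of (W, b) pairs; `gate_W` produces one score per expert.
    Running k of n experts is the source of MoE's compute savings.
    """
    scores = x @ gate_W                    # (n_experts,) gating scores
    top = np.argsort(scores)[-k:]          # indices of the k best-scoring experts
    weights = softmax(scores[top])         # renormalise over the chosen experts
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        W, b = experts[i]
        out += w * np.tanh(x @ W + b)      # weighted sum of active expert outputs
    return out, top

rng = np.random.default_rng(0)
d, n_experts = 4, 8
experts = [(rng.standard_normal((d, d)), rng.standard_normal(d))
           for _ in range(n_experts)]
gate_W = rng.standard_normal((d, n_experts))
y, active = moe_forward(rng.standard_normal(d), experts, gate_W, k=2)
print(len(active))  # 2 -- only 2 of 8 experts ran
```

Total parameters scale with `n_experts`, while per-token compute scales with `k`, which is the performance/efficiency balance the report attributes to MoE.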
每日投行/机构观点梳理(2026-02-05)
Jin Shi Shu Ju· 2026-02-05 12:26
Group 1: Gold and Silver Market Outlook
- A Reuters survey indicates that gold prices are expected to reach a new high of $4,746.50 per ounce by 2026, driven by geopolitical uncertainties and strong central bank purchases, marking a significant increase from last year's forecast of $4,275 [1]
- The average price expectation for silver in 2026 has also been raised to $79.50 per ounce, up from $50 in the previous year's survey [1]

Group 2: Currency and Economic Analysis
- The strong US dollar is exerting downward pressure on gold and silver prices, with analysts suggesting that if the dollar's rebound continues, it may further impact gold prices negatively [2]
- UBS forecasts a 10% increase in global stock markets by the end of the year, with a focus on diversification into markets like China, Japan, and Europe, driven by strategic autonomy and fiscal expansion [3]
- Mitsubishi UFJ reports that the Japanese yen has fallen to a near two-week low due to election expectations, with potential for continued selling pressure as confidence in the ruling party's stability grows [4]
- Goldman Sachs warns of upward fiscal risks in Japan ahead of the upcoming elections, suggesting that unless the Bank of Japan accelerates interest rate hikes, the yen may weaken further [6]

Group 3: Sector-Specific Insights
- Zhongtai Securities expresses a positive outlook on the raw material pharmaceutical sector, highlighting innovations in small nucleic acids and ADC toxins as catalysts for growth [7]
- CITIC Securities recommends focusing on automotive companies with strong cost transfer capabilities and global layouts, as rising raw material prices are expected to pressure profit margins in the first quarter of 2026 [8]
- Galaxy Securities identifies two main paths for AI-driven benefits: enhancing platform efficiency and improving production efficiency through content and tools, suggesting a focus on internet stocks and AI-related applications [9]
CICC: Large Models Will Achieve More Breakthroughs in 2026, Moving a Step Closer to the Long-Term Goal of AGI
Zhi Tong Cai Jing· 2026-02-05 01:39
According to the Zhitong Finance APP, CICC published a research report stating that in 2025 global large model capabilities continued to advance, progressively conquering productivity scenarios with clear progress in reasoning, programming, agentic, and multimodal capabilities, though models' general-purpose ability still falls short on stability and hallucination rates. Looking ahead to 2026, CICC expects large models to achieve further breakthroughs in reinforcement learning, model memory, and context engineering, moving from short-context generation to long chain-of-thought tasks and from text interaction to native multimodality, taking a further step toward the long-term goal of AGI. CICC's main points:

Reinforcement learning is growing in importance and becoming the key to unlocking advanced model capabilities

The introduction of reinforcement learning raises a model's intelligence ceiling, letting it think and reason more logically and more in line with human preferences. Its essence is "self-generated data + multi-round iteration," and its key ingredients are large-scale compute plus high-quality data. Overseas model vendors such as OpenAI and Gemini take reinforcement learning very seriously, and domestic players such as DeepSeek and Alibaba's Qwen are following suit; CICC expects the share of reinforcement learning at model vendors at home and abroad to rise further in 2026.

New directions such as continuous learning, model memory, and world models will see core breakthroughs

Continuous learning and model memory are, at bottom, about solving large models' "catastrophic forgetting" problem by giving models a selective memory mechanism. The core of the algorithms and architectures proposed by Google, such as Titans, MIRAS, and Nested Learning, is to let mod ...
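The "selective memory" mechanism attributed above to Titans/MIRAS-style work is not spelled out in this summary. A classic stand-in for the same goal is an elastic-weight-consolidation (EWC) style penalty, which protects parameters that mattered for old tasks while letting unimportant ones adapt. The numbers and the quadratic penalty below are illustrative assumptions, not the Google algorithms:

```python
import numpy as np

def ewc_update(theta, grad_new, theta_old, fisher, lam=1.0, lr=0.1):
    """One gradient step on a new task with an EWC-style penalty.

    The term lam * fisher * (theta - theta_old) pulls parameters that were
    important for the old task (high Fisher value) back toward their old
    values, while unimportant parameters move freely: a simple form of
    selective memory. Illustrative only; not the Titans/MIRAS mechanism.
    """
    penalty_grad = lam * fisher * (theta - theta_old)
    return theta - lr * (grad_new + penalty_grad)

theta_old = np.array([1.0, 1.0])    # parameters after learning task A
fisher    = np.array([10.0, 0.01])  # theta[0] mattered for task A, theta[1] barely
grad_new  = np.array([1.0, 1.0])    # task B pushes both parameters downward

theta = theta_old.copy()
for _ in range(50):
    theta = ewc_update(theta, grad_new, theta_old, fisher)

# theta[0] stays near 1.0 (protected); theta[1] drifts to follow task B
print(theta[0] > 0.8, theta[1] < 0.0)  # True True
```

Without the penalty (`lam=0`), both parameters would be dragged away by task B's gradient, which is exactly the "catastrophic forgetting" the report says continuous learning and model memory aim to solve.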
CICC | AI Decade Outlook (26): 2026 Key Trends: Model Technology
中金点睛 (CICC Insights) · 2026-02-04 23:52
Core Insights
- The article discusses advancements in large model technology, highlighting improvements in reasoning, programming, agentic capabilities, and multimodal abilities, while also noting existing shortcomings in general reliability and memory capabilities [1][4].

Model Architecture and Optimization
- The Transformer architecture continues to dominate, with a consensus on the efficiency of the Mixture of Experts (MoE) model, which activates only a subset of parameters, significantly reducing computational costs [17][18].
- The industry is exploring various attention mechanisms to balance precision and efficiency, including Full-Attention, Linear-Attention, and Hybrid-Attention [20].

Model Capabilities
- Significant progress has been made in reasoning, programming, agentic tasks, and multimodal applications, with models achieving real productivity in various domains [3][4].
- The introduction of reinforcement learning is crucial for unlocking advanced model capabilities, allowing for more logical reasoning aligned with human preferences [2][23].

Competitive Landscape
- Major players like OpenAI, Gemini, and Anthropic are intensifying their competition, with OpenAI focusing on enhancing reasoning and multimodal integration, while Gemini has made significant strides in model capabilities and is leveraging high-quality data for improvements [11][42][43].
- Domestic models are catching up, maintaining a gap of about six months behind their international counterparts, with companies like Alibaba and ByteDance producing competitive models [12][14].

Future Directions
- The focus for 2026 includes further advancements in reinforcement learning, continuous learning, and world models, with expectations for models to tackle more complex tasks and achieve long-term goals like AGI [27][40].
- Continuous learning and model memory are seen as essential for achieving lifelong learning capabilities, with new algorithms like MIRAS and HOPE being pivotal in this evolution [28][32].
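The attention families listed above (Full, Linear, Hybrid) differ mainly in how they trade precision for state size. As an illustrative sketch (the `elu+1` feature map and all shapes are assumptions, not the mechanism of any model in the article), linear attention keeps a fixed-size running state instead of a cache that grows with the sequence:

```python
import numpy as np

def phi(x):
    """A simple positive feature map (elu + 1); real systems choose this carefully."""
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """Linear-attention sketch: a constant-size state replaces the growing KV cache.

    State S is (d, d) and normaliser z is (d,), independent of sequence length,
    at the cost of only approximating full softmax attention.
    """
    d = Q.shape[1]
    S = np.zeros((d, d))
    z = np.zeros(d)
    out = np.empty_like(V)
    for t in range(Q.shape[0]):
        k, v, q = phi(K[t]), V[t], phi(Q[t])
        S += np.outer(k, v)              # rank-1 update, O(d^2) per token
        z += k
        out[t] = (q @ S) / (q @ z + 1e-9)
    return out

rng = np.random.default_rng(0)
T, d = 16, 4
Q, K, V = (rng.standard_normal((T, d)) for _ in range(3))
out = linear_attention(Q, K, V)
print(out.shape)  # (16, 4)
```

Hybrid-Attention designs interleave layers like this with full-attention layers, recovering some precision while keeping most of the state savings.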
Angel Investors from Google and Microsoft Back This GEO Company, Which Uses Model Memory to Boost AI Visibility | 涌现新项目 (Emerging Projects)
Sou Hu Cai Jing· 2026-01-23 00:21
Core Insights
- geoSurge is a startup focused on Generative Engine Optimization (GEO), utilizing "model internal memory + RAG (Retrieval-Augmented Generation)" to provide GEO services, headquartered in London and founded in April 2025 [1]
- The company has completed Pre-Seed funding supported by European venture capital firm Passion Capital and American venture fund Tuesday Capital, with angel investors from companies like Google DeepMind and Microsoft AI [1]

Team Overview
- CEO Francisco Vigo has 12 years of experience in business data analysis and previously served as Chief Data Scientist at fintech unicorn Zilch [2]
- CTO Jons Mostovojs is an expert in machine learning and systems engineering, focusing on model training and infrastructure [4]
- APAC Head Zoe Li is a former early-stage AI/DeepTech venture capitalist in Europe [4]

Product Offerings
geoSurge's products include three main components:
1. **MEASURE**: Monitors a brand's current ranking in major AI systems like ChatGPT, tracking mentions, frequency, and consistency across time and markets [5]
2. **EXPLORE**: Helps clients understand the reasons behind their performance and provides optimization directions by analyzing model behavior and probability distributions [6]
3. **BOOST**: Enhances brand visibility in AI through corpus engineering, optimizing the model's information set to ensure accurate recognition and recall of brand information [10]

Market Context
- In September 2025, OpenAI's research indicated that 49% of ChatGPT usage is for inquiries, with about 70% of consumers using it for non-work-related purposes, highlighting the importance of AI-generated content for businesses [12]
- GEO is fundamentally more complex than SEO, as it involves understanding AI systems' training and data collection processes, which are often opaque [16]

Challenges and Opportunities
- Brands face the risk of "disappearing" from AI recognition due to unstable memory and model updates, which can alter associations and recommendations [17]
- Many GEO solutions focus on measurement and monitoring, but geoSurge emphasizes enhancing model memory for long-term visibility [17]
- The company aims to combine GEO and traditional SEO strategies to optimize brand exposure effectively [18]

Industry Trends
- GEO was recognized as one of the top AI buzzwords in 2025 by MIT Technology Review, indicating a paradigm shift in branding and marketing [19]
- The GEO market is still in its early stages, with various companies adopting different approaches, but geoSurge stands out by focusing on optimizing model memory for stable brand recognition [19]

Performance Metrics
- Key performance indicators for GEO effectiveness include real click-through rates from LLMs and the frequency of AI crawler activities, which are closely linked to a brand's inclusion in model training datasets [20]
Tang Jie of Tsinghua: Domain-Specific Large Models Are a False Proposition
量子位· 2025-12-26 08:52
Group 1
- The core idea is that scaling foundational models through pre-training is essential for AI to acquire world knowledge and basic reasoning capabilities [4][5]
- More data, larger parameters, and saturated computation remain the most efficient methods for scaling foundational models [5]
- The concept of domain-specific large models is considered a false proposition, as true AGI (Artificial General Intelligence) has not yet been achieved [28][30]

Group 2
- Enhancing reasoning capabilities and aligning long-tail abilities are crucial for improving real-world AI performance [6][7]
- The introduction of agents marks a significant milestone in AI, allowing models to interact with real environments and generate productivity [10][11]
- Implementing memory mechanisms in models is essential for their application in real-world scenarios, with different memory stages mirroring human memory [12][13]

Group 3
- Online learning and self-evaluation are key components for models to improve autonomously, with self-assessment being a critical aspect of this process [14][15]
- The integration of model development and application is becoming increasingly important, with the goal of replacing human jobs through AI [16][17]
- The future of AI applications should focus on enhancing human capabilities rather than merely creating new applications [32][34]

Group 4
- Multimodal capabilities are seen as promising, but their contribution to AGI's upper intelligence limit remains uncertain [21][22]
- The development of embodied AI faces challenges, including data acquisition and the stability of robotic systems [25][26]
- The existence of domain models is driven by enterprises' reluctance to fully embrace AI, aiming to maintain a competitive edge [29][31]