NVIDIA Inference Context Memory Storage Platform
Guotai Haitong | Electronics: "Inflation" Investment Opportunities in the Storage Supply Chain
Guotai Haitong Securities Research · 2026-01-09 13:28
Core Viewpoint
- The introduction of NVIDIA's inference context memory storage platform is expected to intensify supply tightness in the storage market, with NAND and DRAM contract prices projected to rise significantly in Q1 2026 [1][3]

Group 1: NVIDIA's Impact on Storage
- NVIDIA's new Rubin AI platform integrates six chips and is designed to expand storage capacity, achieving a fivefold increase in long-context inference performance, total cost of ownership (TCO), and energy efficiency [2]
- The AI-native inference context memory storage platform serves as a key-value (KV) cache layer, further driving demand for storage solutions [2]

Group 2: Market Predictions
- According to TrendForce, the global server market is expected to peak in 2026, lifting demand for Enterprise SSDs, which will become the largest application for NAND Flash [3]
- Limited supplier production capacity is expected to deepen the supply tightness, with general DRAM contract prices projected to rise 55%-60% and NAND prices 33%-38% in Q1 2026 [3]

Group 3: Changxin Technology's IPO and Growth
- Changxin Technology's IPO is expected to further expand its capital expenditures; the company is the largest and most advanced DRAM R&D and manufacturing enterprise in China [4]
- The company has progressed from first-generation to fourth-generation process technology and offers a comprehensive product range from DDR4 to DDR5 [4]
- Changxin Technology's fixed-asset investments increased significantly from 2022 through the first half of 2025, and the company plans to raise 29.5 billion yuan through its IPO [4]
Guotai Haitong Securities: A Storage Supercycle Is Underway; Watch Related Semiconductor Equipment and Materials Companies: "Inflation" Investment Opportunities in the Storage Supply Chain
Sina Finance · 2026-01-09 09:27
Core Insights
- NVIDIA launched the Rubin AI platform and the inference context memory storage platform at CES 2026, driving demand for storage capacity growth [1][5]
- Changxin Technology has disclosed its prospectus for an IPO on the Sci-Tech Innovation Board, aiming to raise 29.5 billion yuan, marking a new development phase for China's storage industry [1][4]

Group 1: NVIDIA's Innovations
- NVIDIA's inference context memory storage platform significantly enhances long-context inference performance, achieving a fivefold increase in tokens per second, total cost of ownership (TCO) performance, and energy efficiency [2][6]
- The Rubin AI platform integrates six chips and is now in full production, including the Rubin GPU, Vera CPU, NVLink 6, Spectrum-X Ethernet Photonics, ConnectX-9 SuperNIC, and BlueField-4 DPU [1][5]

Group 2: Market Trends and Predictions
- TrendForce predicts a significant increase in storage contract prices, with general DRAM contract prices expected to rise 55%-60% quarter-on-quarter in Q1 2026 and NAND prices expected to increase 33%-38% [3][7]
- The global server market is anticipated to reach a growth peak in 2026, driving demand for Enterprise SSDs, which are expected to become the largest application for NAND Flash [3][7]

Group 3: Changxin Technology's Development
- Changxin Technology is the largest and most advanced DRAM R&D and manufacturing enterprise in China, having achieved mass production across its first- through fourth-generation process technology platforms [4][8]
- The company operates three 12-inch DRAM wafer fabs in Hefei and Beijing, with capital expenditures on fixed and long-term assets from 2022 to mid-2025 totaling 1.744 billion yuan [4][8]
- The IPO aims to raise 29.5 billion yuan, which, if successful, is expected to further expand capital expenditures [4][8]
Guotai Haitong: A Storage Supercycle Is Underway; Watch Related Semiconductor Equipment/Materials Companies
Zhitong Finance · 2026-01-09 06:51
Group 1
- The current AI-driven storage supercycle is expected to show strong sustainability; Changxin Technology's IPO filing marks progress toward a listing on the Sci-Tech Innovation Board, benefiting domestic semiconductor equipment and materials companies [1]
- Investment directions include semiconductor equipment/materials companies with high revenue exposure to the storage industry, and those positioned for domestic-substitution breakthroughs in the storage supply chain [1]
- NVIDIA's introduction of the inference context memory storage platform is driving growth in storage capacity demand and delivering significant performance gains [1]

Group 2
- A significant increase in NAND and DRAM contract prices is anticipated in Q1 2026, driven by a peak in global server market growth and limited supplier capacity [2]
- General DRAM contract prices are projected to rise 55%-60% quarter-on-quarter, while NAND prices are expected to rise 33%-38% [2]

Group 3
- If Changxin successfully lists on the Sci-Tech Innovation Board, it is expected to further expand its capital expenditures; the company is the largest and most advanced DRAM R&D and manufacturing enterprise in China [3]
- The company has invested substantially in fixed and long-term assets in recent years and plans to raise 29.5 billion yuan through the IPO [3]
Jensen Huang: AI Has Completely Transformed the Computing Stack, and Now Storage Joins the Revolution | Live from CES
Sina Finance · 2026-01-06 02:01
Core Insights
- NVIDIA announced the BlueField-4 data processor at CES 2026, which supports the NVIDIA inference context memory storage platform, designed as a new AI-native storage infrastructure for frontier AI [1][4]
- The platform addresses the need for scalable infrastructure to store and share the context data generated by AI models, which now scale to trillions of parameters and multi-step reasoning [1][4]
- The platform extends GPU memory capacity for context memory, enables high-speed sharing across nodes, and improves token processing and energy efficiency by up to 5x compared with traditional storage [5]

Company Developments
- NVIDIA CEO Jensen Huang emphasized that AI is transforming the entire computing stack, including storage, as models move beyond simple interactions to become intelligent collaborative partners capable of long-term reasoning and memory [2][5]
- The platform increases the capacity of key-value (KV) caches and accelerates context sharing between large-scale AI system clusters, improving response times and throughput for multi-turn AI agents [2][5]