Core Viewpoint
- NVIDIA introduced a context memory storage platform at CES 2026, aimed at creating a new memory layer optimized for AI inference that bridges GPU memory and traditional storage [1][2].

Group 1: NVIDIA's New Platform
- The context memory storage platform is a POD-level, AI-native storage infrastructure designed to support long-running AI workloads by optimizing data access and management [2].
- The platform combines hardware and software components, including BlueField-4 for data management, Spectrum-X Ethernet for high-performance networking, and various software tools to improve system efficiency [2].

Group 2: Shift in AI Inference Bottlenecks
- The bottleneck in AI inference is shifting from computation to context storage, requiring a restructured AI storage architecture to handle the growing volume of context data generated by complex tasks [3].
- Demand for storage chips is expected to grow significantly as AI evolves from simple chatbots into more complex collaborative agents that require extensive context capacity [3].

Group 3: Domestic Storage Industry Opportunities
- The ongoing supply-demand imbalance in storage, coupled with limited capacity expansion by overseas storage giants, presents a historic opportunity for domestic storage manufacturers to increase their market share [4].
- Companies such as Changxin Technology and Yangtze Memory Technologies are making significant advances in DRAM and NAND technologies, respectively, which could translate into substantial production capacity increases after their IPOs [4].

Group 4: Related Companies
- Key domestic semiconductor equipment companies include Zhongwei Company, Jingzhida, and Beifang Huachuang, among others [5].
- Companies involved in packaging and testing include Deep Technology and Huicheng Shares [6].
- Firms focused on AI storage solutions that stand to benefit from storage technology iterations include Zhaoyi Innovation and Lianyun Technology [6].
Orient Securities: NVIDIA (NVDA.US) Launches Inference Context Memory Storage Platform; AI Storage Demand Continues to Expand