Core Viewpoint
- The article discusses advances in AI hardware, focusing on NVIDIA's next-generation Rubin CPX, which disaggregates AI inference workloads and upgrades memory performance, suggesting a positive outlook for DRAM pricing and demand in AI applications [1][2][3].

Group 1: NVIDIA's Next-Generation Hardware
- NVIDIA's Rubin CPX architecture disaggregates the compute load of AI inference and upgrades memory for faster data transfer [1][2].
- The new NVIDIA flagship AI server, NVL144 CPX, integrates 36 Vera CPUs, 144 Rubin GPUs, and 144 Rubin CPX GPUs, providing 100 TB of high-speed memory and 1.7 PB/s of memory bandwidth [2].
- When handling large context windows, the Rubin CPX architecture can outperform the current flagship GB300 NVL72 by up to 6.5x [2].

Group 2: Market Trends and Opportunities
- Demand for high-end AI chips is rising, and suppliers are launching new products, which is expected to lift both DRAM shipment volume and prices [3].
- Average Server DRAM capacity is projected to grow 17.3% year-on-year in 2024, reflecting the rising memory requirements of AI servers [3].
- Kaipu Cloud's acquisition of Shenzhen Jintaike's storage business aims to strengthen its enterprise-grade DDR capabilities, signaling strategic moves within the industry [3].
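The system-level figures cited above (100 TB of memory and 1.7 PB/s of bandwidth across 144 Rubin plus 144 Rubin CPX GPUs) imply rough per-GPU averages. The sketch below is a back-of-the-envelope split of the article's totals, not NVIDIA's official per-chip specification:

```python
# Rough per-GPU averages implied by the NVL144 CPX totals cited in the article.
# Illustrative arithmetic only; actual memory is not split evenly across GPU types.

TOTAL_MEMORY_TB = 100        # total high-speed memory (from the article)
TOTAL_BANDWIDTH_PBS = 1.7    # total memory bandwidth in PB/s (from the article)
NUM_GPUS = 144 + 144         # 144 Rubin GPUs + 144 Rubin CPX GPUs

mem_per_gpu_gb = TOTAL_MEMORY_TB * 1000 / NUM_GPUS
bw_per_gpu_tbs = TOTAL_BANDWIDTH_PBS * 1000 / NUM_GPUS

print(f"avg memory per GPU: {mem_per_gpu_gb:.0f} GB")       # ~347 GB
print(f"avg bandwidth per GPU: {bw_per_gpu_tbs:.1f} TB/s")  # ~5.9 TB/s
```

Even as a crude average, the numbers illustrate why the report ties this system generation to rising DRAM demand: hundreds of gigabytes of high-speed memory per accelerator.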
Guotai Haitong | Electronics: Next-Generation NVIDIA Rubin CPX Memory Upgrade
Guotai Haitong Securities Research · 2025-09-11 14:05