CXL Interconnect Chips
Storage Is the Integral of Tokens; Broad Space Across the Industry Chain
GF SECURITIES· 2025-12-14 05:49
Investment Rating
- The industry investment rating is "Buy", unchanged from the previous rating [2].

Core Viewpoints
- The storage sector is crucial for AI inference, driving rapid growth in demand for storage, particularly HBM, DRAM, and SSD, characterized by decreasing costs and increasing capacities [5][13].
- AI-driven storage demand is expected to surge, with projections indicating a need for hundreds of exabytes (EB) of capacity in the near future [5][24].
- The report emphasizes the broad space across the industry chain, highlighting opportunities in eSSD, MRDIMM, SPD and VPD chips, as well as CXL storage pooling [5][79].

Summary by Sections
1. Storage as Tokens for AI Inference
- AI servers utilize various storage types, including HBM, DRAM, and SSD, with a focus on high bandwidth and large capacity to support efficient data processing [13][17].
- Demand for SSD and HDD is projected to grow significantly, with estimates suggesting that ten Google-scale inference applications would require 49 EB of storage by 2026 [24].
2. AI-Driven Storage Demand Growth
- eSSD is identified as a core demand area for AI and storage servers, with needs for high bandwidth and large capacity rising due to long-context inference and RAG databases [25][26].
- The AI server eSSD market is expected to expand, with theoretical maximum capacities of 59 EB, 89 EB, and 120 EB for 2024, 2025, and 2026 respectively [27][34].
3. MRDIMM Applications
- MRDIMM is anticipated to enhance large-model inference performance, providing significant bandwidth improvements and capacity expansion [38][39].
4. SPD and VPD Chip Opportunities
- The transition to DDR5 memory modules presents growth opportunities for SPD and VPD chips, driven by higher specifications and rising demand [45][46].
5. CXL Storage Pooling
- CXL technology facilitates storage pooling, enhancing computational efficiency and enabling better resource allocation for AI applications [53][54].
- The report notes significant TCO advantages in KV Cache performance when CXL is used for high-concurrency, long-context workloads [56][59].
6. Investment Recommendations
- The report suggests focusing on storage industry chain-related companies, as AI-driven storage price increases are expected to improve manufacturers' profit margins [79].
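The KV Cache pressure behind these claims can be illustrated with a back-of-envelope calculation. The model dimensions below (80 layers, 8 KV heads of dimension 128, FP16) are illustrative assumptions typical of a 70B-class transformer, not figures from the report; the sketch shows why long contexts at high concurrency quickly exceed on-package HBM and push demand toward pooled DRAM and eSSD.

```python
# Back-of-envelope KV cache sizing for long-context inference.
# All model parameters here are illustrative assumptions, not from the report.

def kv_cache_bytes(layers, kv_heads, head_dim, context_len, dtype_bytes=2):
    """Per-request KV cache: 2 (K and V) x layers x heads x dim x tokens x bytes."""
    return 2 * layers * kv_heads * head_dim * context_len * dtype_bytes

# Hypothetical 70B-class model: 80 layers, 8 KV heads of dim 128, FP16 (2 bytes).
per_request = kv_cache_bytes(layers=80, kv_heads=8, head_dim=128, context_len=128_000)
print(f"KV cache per 128k-token request: {per_request / 2**30:.1f} GiB")

# At 1,000 concurrent long-context requests, the memory pool required:
print(f"Pool for 1,000 requests: {per_request * 1000 / 2**40:.1f} TiB")
```

Tens of GiB per request, and tens of TiB at realistic concurrency, is far beyond a single accelerator's HBM, which is the motivation for tiering KV Cache into CXL-attached memory and fast eSSD.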
GF Securities: CXL Storage Pooling Aids AI Inference; Attention Recommended for CXL Interconnect Chip Vendors
Zhi Tong Cai Jing· 2025-10-10 03:02
Core Insights
- CXL technology enhances computing efficiency through storage pooling and high-speed interconnects, with significant implications for AI applications [1]
- Major players such as NVIDIA and Alibaba Cloud are actively developing CXL capabilities to improve performance and resource utilization in AI systems [2][3]

Group 1: CXL Technology Overview
- CXL is an open high-speed serial protocol designed to facilitate communication among CPU, memory, and GPU, achieving higher data throughput and lower latency [1]
- The technology supports efficient collaboration between accelerators such as GPUs and FPGAs and main processors, addressing memory bandwidth bottlenecks and enhancing computational efficiency [1]

Group 2: NVIDIA's Strategic Moves
- NVIDIA has invested $5 billion in Intel to develop customized x86 CPUs for its AI infrastructure, leveraging Intel's role in the CXL Consortium to enhance interoperability between NVLink and CXL technologies [2]
- The acquisition of Enfabrica allows NVIDIA to integrate advanced AI interconnect technologies, including low-latency data paths and high-capacity memory support, optimizing GPU-CPU interconnects [2]

Group 3: Alibaba Cloud's Innovations
- Alibaba Cloud has launched the world's first CXL 2.0 Switch-based PolarDB database server, achieving ultra-low latency and high bandwidth for remote memory access [3]
- The server improves resource utilization and inference throughput by enabling collaborative pathways among GPU, CPU, and shared memory pools, positioning it as a robust foundation for AI-driven data solutions [3]
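The resource-utilization benefit of CXL memory pooling described above can be sketched with a toy provisioning model: without pooling, every server must carry enough local DRAM for its own worst-case demand; with a shared pool, capacity only needs to cover the aggregate peak, which is smaller when per-node demands are uncorrelated. The node count and demand distribution below are arbitrary assumptions for illustration.

```python
import random

random.seed(0)
NODES, TRIALS = 16, 10_000
WORST_CASE_PER_NODE = 512  # hypothetical per-node peak demand ceiling, GiB

def node_demand():
    # Hypothetical per-node memory demand in GiB, uniform between 64 and 512.
    return random.uniform(64, WORST_CASE_PER_NODE)

# Without pooling: every node is provisioned for its individual worst case.
static_capacity = NODES * WORST_CASE_PER_NODE

# With a CXL-style shared pool: provision for the observed aggregate peak,
# estimated here by sampling many independent demand snapshots.
pooled_capacity = max(
    sum(node_demand() for _ in range(NODES)) for _ in range(TRIALS)
)

print(f"static per-node provisioning: {static_capacity:.0f} GiB")
print(f"shared-pool provisioning:     {pooled_capacity:.0f} GiB")
print(f"capacity saved by pooling:    {1 - pooled_capacity / static_capacity:.0%}")
```

The aggregate peak sits well below the sum of individual worst cases, which is the statistical effect pooling exploits: memory stranded in over-provisioned nodes becomes capacity that any node can borrow over the CXL fabric.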