Core Insights
- Demand for high-bandwidth, large-capacity eSSDs in AI workloads is expected to keep growing, driven by the rapid expansion of long-context reasoning, RAG databases, and token scales [1][2]
- The eSSD market for AI servers is projected to reach 59 EB in 2024, 89 EB in 2025, and 120 EB in 2026, indicating significant growth potential [2]
- The eSSD market structure suggests that AI-server and storage-server applications will shift the industry's focus toward AI-driven storage scenarios [3]

Group 1: AI Workload and eSSD Demand
- In AI applications, eSSDs are used primarily for training, inference, and data storage; during training they handle checkpoint saving, model loading, and system booting [1]
- During inference, the KV cache generated by long contexts spills over to high-performance eSSDs when it exceeds memory capacity, while RAG databases depend on eSSDs for large-scale knowledge retrieval [1]
- The growing scale of long-context reasoning and RAG databases will further increase demand for high-capacity eSSDs in AI workloads [1]

Group 2: Market Projections
- The estimated market space for AI-server eSSDs is based on assumptions from Nvidia's white paper, which indicates a total cache capacity of 30 TB per compute tray [2]
- The theoretical maximum market for AI-server eSSDs is projected at 59 EB in 2024, 89 EB in 2025, and 120 EB in 2026, though actual deployments may fall short of these figures [2]
- By 2030, the eSSD markets for general servers, AI servers, and storage servers are projected to reach 178 EB, 134 EB, and 614 EB respectively, with storage servers showing the highest growth rate [3]
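The projections above can be sanity-checked with simple arithmetic. The sketch below is a rough back-of-envelope calculation, not part of the cited report: it assumes decimal units (1 EB = 10^6 TB) and applies the 30 TB-per-tray figure to back out how many compute trays each year's projected eSSD capacity would imply.

```python
# Back-of-envelope check: how many 30 TB compute trays would each
# projected AI-server eSSD market size imply? (Illustrative only;
# the 30 TB/tray figure comes from the white-paper assumption cited
# above, the tray counts are derived, not from the report.)
TB_PER_TRAY = 30
TB_PER_EB = 1_000_000  # decimal units: 1 EB = 10^6 TB

projected_eb = {2024: 59, 2025: 89, 2026: 120}

for year, eb in projected_eb.items():
    implied_trays = eb * TB_PER_EB / TB_PER_TRAY
    print(f"{year}: {eb} EB implies ~{implied_trays / 1e6:.1f}M trays")
```

For example, the 120 EB projected for 2026 corresponds to roughly 4 million trays at 30 TB each; actual deployed capacity, as the report notes, may be lower than this theoretical maximum.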
GF Securities (广发证券): Broad market space for eSSDs in AI and storage servers; investors advised to watch core beneficiaries in the supply chain