Storage Power China Tour and Advanced Storage AI Inference Workshop Successfully Held in Beijing
Securities Daily Online (Zheng Quan Ri Bao Wang)·2025-11-07 07:29

Core Insights
- The conference focused on the role of advanced storage in empowering AI model development in the AI era [1][2]
- Experts from a range of organizations discussed the challenges and solutions related to AI inference and storage technology [2][3][4]

Group 1: Advanced Storage and AI Inference
- The chief expert from the China Academy of Information and Communications Technology emphasized that advanced storage is crucial for improving AI inference efficiency and controlling costs [2]
- National policies highlight the importance of advancing storage technology and strengthening the storage industry's capabilities [2]
- A working group was established to promote collaboration and innovation in storage technology within the AI inference sector [2]

Group 2: Technical Challenges and Solutions
- Current challenges in AI inference include the need for upgraded KV Cache storage, multi-modal data collaboration, and bandwidth limitations [3]
- China Mobile is deploying layered caching, high-speed data interconnects, and proprietary high-density servers to improve storage efficiency and reduce costs [3]
- Huawei's UCM inference memory data management technology addresses data management, computing power supply, and cost reduction in AI applications [4]

Group 3: Industry Collaboration and Future Directions
- The conference facilitated discussion among industry experts from multiple companies, building consensus on the future direction of the storage industry [5]
- The focus is on improving utilization of computing resources and addressing high-concurrency, low-latency requirements in AI inference [4][5]
- The successful hosting of the conference is seen as a step toward fostering innovation and collaboration in the storage industry [5]
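To make the layered-caching idea mentioned above concrete, the sketch below models a two-tier KV cache in which a small fast tier (standing in for HBM/DRAM) spills least-recently-used entries to a larger slow tier (standing in for SSD) instead of discarding them. This is a minimal illustration under generic LRU assumptions; the `TieredKVCache` class and its tier layout are hypothetical and do not describe China Mobile's deployment or Huawei's UCM.

```python
from __future__ import annotations

from collections import OrderedDict


class TieredKVCache:
    """Illustrative two-tier KV cache (hypothetical, not any vendor's design).

    Entries evicted from the bounded fast tier are spilled to a slow tier,
    so previously computed KV data can be promoted back instead of recomputed.
    """

    def __init__(self, fast_capacity: int):
        self.fast_capacity = fast_capacity
        self.fast: OrderedDict[str, bytes] = OrderedDict()  # LRU order: oldest first
        self.slow: dict[str, bytes] = {}                    # unbounded slow tier

    def put(self, key: str, value: bytes) -> None:
        self.fast[key] = value
        self.fast.move_to_end(key)                # mark as most recently used
        while len(self.fast) > self.fast_capacity:
            old_key, old_val = self.fast.popitem(last=False)  # evict the LRU entry
            self.slow[old_key] = old_val          # spill to the slow tier, don't drop

    def get(self, key: str) -> bytes | None:
        if key in self.fast:
            self.fast.move_to_end(key)            # refresh recency on a fast-tier hit
            return self.fast[key]
        if key in self.slow:
            value = self.slow.pop(key)
            self.put(key, value)                  # promote back into the fast tier
            return value
        return None                               # full miss: caller must recompute
```

The design choice being illustrated is that a spill-on-evict policy trades slower reads on cold entries for avoiding recomputation entirely, which is the cost/efficiency balance the conference discussion centers on.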