In the Token Economy Era, Is "Storage Power" the Bottleneck Slowing AI Inference?
Tai Mei Ti APP · 2025-11-07 04:08

Core Insights

- The AI industry is undergoing a structural shift, moving its focus from GPU scaling to the storage capabilities that determine AI performance and cost efficiency [1][10]
- Demand for advanced storage solutions is expected to rise as AI applications grow more demanding, with the storage price outlook remaining bullish through Q4 2025 [1][10]
- Starting in 2025, the industry is expected to transition from a "parameter scale" arms race to commercial competition over "inference efficiency," making token usage central to the economics of AI inference [2][10]

Storage and Inference Changes

- Inference loads are changing fundamentally for three reasons: KVCache capacity grows rapidly as contexts lengthen, multi-modal data demands more advanced I/O capabilities, and performance must stay consistent under high load [4][10] (see the KVCache sizing sketch at the end of this digest)
- The bottleneck in inference systems is increasingly storage capability rather than GPU power: GPUs often sit waiting for data rather than running out of compute [5][10]
- Raising GPU utilization by 20% can cut overall costs by 15%-18%, which makes efficient data supply more valuable than simply adding GPUs [5][10] (a worked version of this arithmetic follows below)

New Storage Paradigms

- Storage is evolving from a passive repository into an active component of AI inference, with the focus shifting from raw capacity to data flow management [6][10]
- Traditional storage architectures struggle to deliver the high throughput, low latency, and heterogeneous data integration that AI applications require, which hinders their deployment [7][10]
- New technologies, such as CXL and multi-level caching, are being developed to optimize data flow and improve the efficiency of AI inference systems [6][10] (a tiered-cache sketch appears after this digest)

Future Directions

- Over the next three years, consensus is expected around four directions: scarcity will shift from GPUs themselves to the ability to feed GPUs data efficiently, data management will become central to AI systems, real-time storage capability will become essential, and CXL architecture will redefine the boundary between memory and storage [10][11][12]
- Competition in AI will extend beyond model performance to the underlying infrastructure, putting a premium on effective data management and data flow [12]
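
The article does not give concrete KVCache numbers, but the standard sizing formula for transformer inference makes the growth with context length easy to see. The sketch below is illustrative only: the model dimensions (layers, KV heads, head size, precision, batch size) are assumptions chosen to resemble a 70B-class model, not figures from the source.

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, batch, dtype_bytes=2):
    """Per-request KV cache cost: 2 tensors (K and V) per layer, each of
    shape (n_kv_heads, head_dim) per token, stored at dtype_bytes precision."""
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * batch * dtype_bytes

# Assumed 70B-class dimensions (not from the article):
# 80 layers, 8 KV heads (grouped-query attention), head_dim 128, FP16.
for ctx in (4_096, 32_768, 128_000):
    gib = kv_cache_bytes(80, 8, 128, ctx, batch=32) / 2**30
    print(f"context {ctx:>7,} tokens -> ~{gib:,.0f} GiB KV cache for 32 concurrent requests")
```

Under these assumptions the cache goes from roughly 40 GiB at a 4K context to well over 1 TiB at 128K, quickly exceeding a single GPU's HBM, which is why the article argues the cache must spill into tiered memory and storage.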
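The 15%-18% figure is quoted without its derivation; the sketch below shows one plausible way the arithmetic works out, assuming a 20% relative utilization gain and that GPUs dominate inference cost. The GPU cost shares are assumptions, not numbers from the source.

```python
utilization_gain = 0.20                      # 20% relative improvement (from the article)
gpu_hours_saved = 1 - 1 / (1 + utilization_gain)   # same workload on ~16.7% fewer GPU-hours

for gpu_cost_share in (0.90, 1.00):          # assumed share of total cost borne by GPUs
    overall_saving = gpu_cost_share * gpu_hours_saved
    print(f"GPU cost share {gpu_cost_share:.0%} -> overall cost down ~{overall_saving:.1%}")
```

With GPUs at 90%-100% of total inference cost, the savings land at roughly 15%-17%, in line with the quoted band; other readings of "20% utilization gain" (e.g., 20 percentage points) would shift the numbers.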
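The article mentions multi-level caching and CXL only at a high level. The toy sketch below illustrates the general tiering idea: look up a KV block in the fastest tier first, fall through to slower tiers, promote on hit, and cascade LRU evictions downward. The tier names, capacities, and API are illustrative assumptions, not a description of any product in the source.

```python
from collections import OrderedDict

class TieredKVCache:
    """Illustrative three-tier KV block cache: HBM -> DRAM/CXL -> SSD.
    Each tier is an LRU; a hit in a slower tier is promoted to HBM,
    and eviction victims cascade one tier down."""

    def __init__(self, capacities=(4, 16, 64)):      # blocks per tier (toy sizes)
        self.tiers = [OrderedDict() for _ in capacities]
        self.caps = capacities
        self.names = ("HBM", "DRAM/CXL", "SSD")

    def get(self, block_id):
        for i, tier in enumerate(self.tiers):
            if block_id in tier:
                value = tier.pop(block_id)
                self._put(0, block_id, value)        # promote to the fastest tier
                return self.names[i], value          # report where it was found
        return None, None                            # miss: caller recomputes/fetches

    def put(self, block_id, value):
        self._put(0, block_id, value)

    def _put(self, level, block_id, value):
        if level >= len(self.tiers):
            return                                   # evicted past SSD: dropped
        tier = self.tiers[level]
        tier[block_id] = value
        tier.move_to_end(block_id)
        if len(tier) > self.caps[level]:             # cascade the LRU victim downward
            victim, vval = tier.popitem(last=False)
            self._put(level + 1, victim, vval)

cache = TieredKVCache()
for i in range(10):
    cache.put(f"blk{i}", f"kv{i}")
tier, _ = cache.get("blk0")                          # early block has been demoted
print("blk0 found in:", tier)                        # -> DRAM/CXL
```

Real systems add prefetching, asynchronous demotion, and CXL-attached memory pooling on top of this basic promote/demote loop; the sketch only captures the data-flow management idea the article attributes to the new storage paradigm.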