Core Viewpoint
- The new wave of computing power will be driven by the explosion of AI inference, as stated by Lenovo Chairman and CEO Yang Yuanqing during his CES keynote speech [6].

Group 1: Evolution of Computing Power Infrastructure
- The global computing power infrastructure market is undergoing its fourth wave of innovation: the first wave focused on traditional computing for enterprise informatization and digital transformation; the second was driven by cloud services and applications, fueling the rapid rise of cloud computing; the third was characterized by large-scale computing clusters for training large language models, primarily in the cloud [3][8].
- The fourth, current wave is shifting the focus from "training" to "inference," with a broad consensus in the global AI industry that local deployment of AI inference is becoming a genuine competitive advantage for enterprises [3][8].

Group 2: Local AI Inference Deployment
- Local deployment of AI inference enables faster response times because inference occurs closer to where data is generated, which requires a hybrid computing infrastructure spanning public cloud, private cloud, local data centers, and edge computing [3][8].
- AMD Chair and CEO Lisa Su shares this view, emphasizing that global enterprises need to bring AI closer to their data while maintaining flexibility and the ability to evolve over time [3][8].

Group 3: New Product Launches
- Lenovo has launched a comprehensive suite of inference-optimized server products, including the AI inference servers SR675i and SR650i and the edge computing server SE455i, aimed at improving inference efficiency, reducing operational costs, and strengthening data security to meet diverse, real-time AI deployment needs [4][9].
Yang Yuanqing: The New Wave of Computing Power Will Stem from the Explosion of AI Inference | Live from CES