Direct from WAIC 2025 | Interview with Intellifusion Chairman Chen Ning: Only Terminal-Edge-Cloud Collaboration Can Deliver the Optimal Solution for Large-Scale AI Deployment

Core Insights
- The rapid rise of AI Agents is raising the importance of inference chips, with major computing vendors showcasing new products at WAIC 2025 [1]
- Intellifusion announced a focus on AI chips, aiming to build a domestic computing "accelerator" around three core areas: edge computing, cloud-side large-model inference, and embodied intelligence [1]
- Chen Ning, chairman and CEO of Intellifusion, believes that AI technology centered on large models and inference chips will redefine all electronic products over the next five years [1][2]

Inference Chip Market Dynamics
- Chen Ning compares the evolution of AI to a student graduating from university: the industry is moving from an AI training era into an inference era [2]
- In the inference era, AI will empower all electronic products, requiring many kinds of inference chips spanning terminal, edge, and cloud computing [2]
- Chen Ning stresses that only a collaborative approach across terminal, edge, and cloud computing can deliver the best cost-effectiveness for large-scale deployment [2][3]

Cost Considerations in Inference Chips
- As AI enters the inference era and becomes embedded in daily life, the cost of inference will matter more and more [3]
- PPA (Performance, Power consumption, Area) is the core concept in chip design and, from the user's perspective, largely determines a chip's value and cost-effectiveness [3][4]
- For edge computing, the balance between compute, memory, and customized services is essential, with effective power consumption and hardware cost as the key factors [3][4]

Performance Metrics for Cloud and Edge Inference
- Cloud-side inference is judged mainly on the hardware procurement cost and operating expenses of running inference chip clusters, while edge computing is more sensitive to hardware cost and effective compute in a specific scenario [4]
- Effective compute and hardware cost are therefore the critical inputs when assessing a device's cost-performance ratio; an illustrative calculation follows this summary [4]
- Intellifusion is concentrating on building a cost-effective inference chip technology and product system to drive the large-scale deployment of AI applications [4]
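The cost framing above (effective compute, power consumption, hardware cost) can be made concrete with a small calculation. The sketch below is illustrative only: the `InferenceDeployment` class and every number in it are hypothetical assumptions, not figures from the interview or from Intellifusion. It simply shows one way a lifetime cost per unit of effective compute could be compared between a cloud cluster node and an edge device.

```python
from dataclasses import dataclass

# Minimal sketch of the cost-effectiveness framing in the summary above.
# All figures are hypothetical placeholders, not Intellifusion data.

@dataclass
class InferenceDeployment:
    name: str
    peak_tops: float            # vendor-rated peak compute (TOPS)
    utilization: float          # fraction of peak usable in the target workload
    hardware_cost_usd: float    # procurement cost of the chip/board/cluster node
    power_watts: float          # sustained power draw under load
    lifetime_hours: float       # planned service life
    electricity_usd_per_kwh: float

    @property
    def effective_tops(self) -> float:
        # "Effective compute": what the workload can actually extract, not the datasheet peak.
        return self.peak_tops * self.utilization

    @property
    def energy_cost_usd(self) -> float:
        # Operating expense from power alone over the service life.
        return (self.power_watts / 1000.0) * self.lifetime_hours * self.electricity_usd_per_kwh

    @property
    def usd_per_effective_tops(self) -> float:
        # Total cost of ownership (hardware + energy) per unit of usable compute.
        return (self.hardware_cost_usd + self.energy_cost_usd) / self.effective_tops


# Hypothetical cloud cluster node vs. edge box, for illustration only.
cloud_node = InferenceDeployment("cloud node", peak_tops=2000, utilization=0.45,
                                 hardware_cost_usd=40_000, power_watts=1500,
                                 lifetime_hours=3 * 365 * 24, electricity_usd_per_kwh=0.10)
edge_box = InferenceDeployment("edge box", peak_tops=100, utilization=0.70,
                               hardware_cost_usd=1_200, power_watts=30,
                               lifetime_hours=3 * 365 * 24, electricity_usd_per_kwh=0.15)

for d in (cloud_node, edge_box):
    print(f"{d.name}: {d.effective_tops:.0f} effective TOPS, "
          f"${d.usd_per_effective_tops:,.1f} per effective TOPS over its lifetime")
```

The point of the comparison mirrors the interview's framing: the cloud case is dominated by cluster procurement and operating expense, while the edge case hinges on how much effective compute a low-cost, low-power device delivers in its specific scenario.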