AI Inference Drives a Cloud Platform Transformation; Edge Computing Becomes Vendors' New Battleground
Zhong Guo Jing Ying Bao · 2025-11-12 11:47

Core Insights
- Demand for AI infrastructure is expanding significantly as AI applications evolve, with a shift from centralized cloud architectures to edge computing for real-time AI processing [1][2][5]
- Akamai and NVIDIA have launched the Akamai Inference Cloud, a distributed generative edge platform designed for low-latency, real-time AI processing globally [1][5]
- AI inference workloads are expected to far exceed training workloads, necessitating a reevaluation of computational infrastructure to support real-time processing demands [2][3]

Industry Trends
- The AI industry is transitioning from model development to practical application, with AI applications evolving from simple request-response models to complex multi-step reasoning and real-time decision-making [2][3]
- Edge computing is becoming essential for AI inference, shifting from a supporting role for centralized cloud services to a primary function that enhances user experience and operational efficiency [2][3]

Market Potential
- The global edge AI market is projected to exceed $140 billion by 2032, up from $19.1 billion in 2023, indicating explosive growth [4]
- The broader edge computing market could reach $3.61 trillion by 2032, with a compound annual growth rate (CAGR) of 30.4% [4]

Competitive Landscape
- Major tech companies, including Google, Microsoft, and Amazon, are actively investing in edge computing, leveraging their technological strengths and large user bases [5][6]
- Akamai has established a global platform with over 4,200 edge nodes, enhancing its capability to support AI inference services and its competitiveness in overseas markets [6]