AI Application Deployment Also Needs "Decentralization"
Tai Mei Ti APP · 2025-10-22 09:42

Core Insights
- 79% of surveyed enterprises believe generative AI will have a disruptive impact on their business within the next 18 months, 12 percentage points higher than the Asia-Pacific average [1]
- 37% of enterprises have deployed generative AI in production environments, while 61% are still in testing and proof-of-concept, indicating that AI adoption has moved from the "PPT stage" to the "practical stage" [1]
- In 2023-2024 the key focus was the "large model parameter competition," with enterprises chasing "hundred-billion-parameter" models and "multimodal capabilities" [1]
- In 2025 the emphasis shifts to "scenario implementation," with businesses seeking to solve real-world problems with AI [1]

Infrastructure Strategy
- Enterprises in the Asia-Pacific region recognize that centralized cloud architectures cannot meet growing demands for scale, speed, and compliance, forcing a rethink of infrastructure strategy to include edge services [1]
- A modern digital foundation that integrates "cloud-core-edge computing" is needed to deploy intelligent services closer to users and applications [2]

Challenges in AI Implementation
- Among the 37% of enterprises that have deployed generative AI in production, over 60% report unexpected delays in real-time interactive applications, with conversion rates dropping by 40% due to latency [3]
- Cost remains a significant barrier to adoption, as the massive data volumes generated by AI inference drive up bandwidth costs [3]
- 72% of enterprises expanding overseas have been forced to abandon centralized cloud processing because of cross-border data-transfer compliance requirements, particularly those covering user privacy data [3]

Edge Computing Emergence
- Relying on the traditional public cloud model alone is not enough for all enterprises to embrace AI, making a restructured digital foundation that incorporates edge computing an urgent need [4]
- Edge computing is positioned as a core technology for the next generation of digital infrastructure, enabling distributed deployment that reduces latency and improves business responsiveness [4]

Market Trends and Predictions
- The global edge cloud market is projected to reach 185.1 billion yuan in 2024, with China accounting for approximately 70% of that market [5]
- By 2025, edge IT is expected to be the most significant area of IT spending growth for most Chinese enterprises, and 80% of CIOs in the Asia-Pacific region are expected to rely on edge services to support AI workloads by 2027 [6]

Investment Directions
- Future edge IT investments will focus on four areas: supporting digital operations such as AI and IoT, ensuring business continuity when disconnected from core or cloud resources, supporting operations in remote locations, and reducing connectivity costs [7]

Integration of Generative AI and Edge Computing
- The integration of generative AI and edge computing is bridging the gap between centralized cloud resources and distributed edge environments, ensuring both scalability and performance [9]

Six Pillars of AI-Ready Infrastructure
- The report outlines six core pillars for building AI-ready infrastructure, emphasizing a holistic approach that extends from core to edge [10]

Pillar 1: AI Readiness
- Infrastructure must be adapted for AI, with a focus on hardware optimization and support for personalized applications to improve efficiency and user experience [11][12]

Pillar 2: GenAI Deployment
- Generative AI deployment is shifting from a focus on large model parameters to lightweight adaptations for edge environments, which requires corresponding hardware investment [13]

Pillar 3: Modern Edge IT
- Modern edge IT emphasizes extracting value at the data source, prioritizing edge inference and efficient data storage strategies [14]

Pillar 4: Edge Optimization Architecture
- A unified scheduling solution for distributed resources is essential to avoid "edge island" scenarios, and a three-layer architecture is proposed (an illustrative scheduling sketch follows below) [15]
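To make the "three-layer" idea concrete, here is a minimal sketch of how a unified scheduler might place an inference request across device-edge, regional-edge, and central-cloud tiers. The tier names, latency figures, capacity fields, and data-residency flags are illustrative assumptions for this sketch, not details taken from the report.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative three-tier model: device edge -> regional edge -> central cloud.
# Tier names, latencies, and capacities are assumptions for this sketch.
@dataclass
class Tier:
    name: str
    typical_latency_ms: int   # round-trip latency seen by the end user
    free_accelerators: int    # spare inference capacity on this tier
    in_region: bool           # True if data stays inside the user's jurisdiction

@dataclass
class Request:
    latency_budget_ms: int    # e.g. 100 ms for a real-time interactive app
    data_must_stay_in_region: bool

def place(request: Request, tiers: list[Tier]) -> Optional[Tier]:
    """Pick the lowest-latency tier that satisfies the latency budget,
    the data-residency constraint, and still has capacity."""
    for tier in sorted(tiers, key=lambda t: t.typical_latency_ms):
        if request.data_must_stay_in_region and not tier.in_region:
            continue
        if tier.typical_latency_ms > request.latency_budget_ms:
            continue
        if tier.free_accelerators < 1:
            continue
        return tier
    return None  # no tier fits; caller may queue, degrade, or reject

# Example: a latency-sensitive, residency-constrained request lands on the edge.
tiers = [
    Tier("device-edge", 10, 1, True),
    Tier("regional-edge", 40, 8, True),
    Tier("central-cloud", 180, 64, False),
]
chosen = place(Request(latency_budget_ms=100, data_must_stay_in_region=True), tiers)
print(chosen.name if chosen else "no placement")  # -> device-edge
```

The point of the sketch is only that placement decisions weigh latency, compliance, and capacity together across layers, which is what a unified scheduler has to do to keep edge nodes from becoming isolated islands.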
Pillar 5: Cloud to Edge
- Existing public cloud investments should be leveraged for edge deployment, with a focus on interoperability and data consistency [16]

Pillar 6: Autonomous Operations
- As edge nodes scale out, AI-driven management of the infrastructure becomes crucial, improving operational efficiency and reducing downtime (a minimal monitoring sketch follows below) [17][18]
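As a rough illustration of the autonomous-operations idea, the sketch below watches one health metric per edge node (inference latency, assumed) with a rolling mean/standard-deviation check and triggers an automated remediation step when a node drifts out of its normal range. The metric, window size, threshold, and remediation action are all assumptions for illustration; a real system would plug into actual telemetry and orchestration APIs.

```python
import statistics
from collections import defaultdict, deque

WINDOW = 30          # rolling samples kept per node (assumed)
Z_THRESHOLD = 3.0    # flag values more than 3 sigma from the rolling mean (assumed)

history: dict[str, deque] = defaultdict(lambda: deque(maxlen=WINDOW))

def observe(node_id: str, inference_latency_ms: float) -> bool:
    """Record one latency sample for a node; return True if it looks anomalous."""
    window = history[node_id]
    anomalous = False
    if len(window) >= 10:  # need enough samples for a meaningful baseline
        mean = statistics.fmean(window)
        stdev = statistics.pstdev(window)
        if stdev > 0 and abs(inference_latency_ms - mean) / stdev > Z_THRESHOLD:
            anomalous = True
    window.append(inference_latency_ms)
    return anomalous

def remediate(node_id: str) -> None:
    # Placeholder for an automated action: drain traffic, restart the runtime,
    # or open a ticket. In a real deployment this would call the orchestrator.
    print(f"[auto-ops] node {node_id}: anomaly detected, draining traffic")

# Example: a node that suddenly slows down is flagged without human intervention.
for latency in [42, 40, 43, 41, 39, 44, 40, 42, 41, 43, 40, 250]:
    if observe("edge-node-7", latency):
        remediate("edge-node-7")
```

The design intent is simply that, as node counts grow, routine detection and first-response actions run automatically so operators only handle the cases the automation cannot resolve.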