AI Adoption Also Needs "Decentralization" | ToB Industry Observation
Tai Mei Ti APP · 2025-10-22 03:05
Core Insights
- 79% of surveyed enterprises believe that generative AI will have a disruptive impact on their business within the next 18 months, 12 percentage points higher than the Asia-Pacific average [2]
- 37% of enterprises have deployed generative AI in production environments, while 61% are in the testing and proof-of-concept stages, indicating that AI implementation has moved from the "PPT stage" to the "practical stage" [2]
- In 2023-2024 the key focus was the "large model parameter competition," with enterprises pursuing hundred-billion-parameter models and multimodal capabilities [2]
- In 2025 the emphasis shifts to "scenario implementation," where businesses seek to solve real-world problems with AI [2]

Infrastructure Strategy
- Enterprises in the Asia-Pacific region recognize that centralized cloud architectures cannot meet growing demands for scale, speed, and compliance, and are rethinking their infrastructure strategies to include edge services [2]
- Relying on the public cloud for production applications has exposed shortcomings, particularly for generative AI workloads [4]
- Among the 37% of enterprises that have deployed generative AI in production, over 60% report unexpected latency in real-time interactive applications, with conversion rates dropping by as much as 40% as a result [4]

Edge Computing Emergence
- Reliance on the public cloud alone is not enough for every enterprise to embrace AI, creating the need for a modernized digital foundation that integrates edge computing [5]
- Edge computing is becoming a core technology for building the next generation of digital infrastructure, enabling distributed deployment that reduces latency and improves responsiveness [5][6]
- The global edge cloud market is expected to reach 185.1 billion yuan by 2024, with China accounting for approximately 70% [6]

Investment Directions
- Future edge IT investments will focus on supporting digital operations, ensuring business continuity when disconnected from core or cloud resources, and reducing connectivity costs [7]
- The integration of generative AI and edge computing is bridging the gap between centralized cloud resources and distributed edge environments, ensuring scalability and performance [10]

Six Pillars of AI-Ready Infrastructure
- The report outlines six core pillars for building AI-ready infrastructure, emphasizing a holistic approach that extends from core to edge [11]
- Pillar one focuses on making infrastructure adaptable to AI, improving efficiency and user experience through hardware optimization and support for personalized applications [12]
- Pillar two highlights the shift from large-model competition to edge adaptation, requiring hardware investment in edge-grade GPUs and heterogeneous computing chips [14]
- Pillar three emphasizes modernizing edge IT to extract value at the data source, significantly reducing data transmission volumes (see the first sketch after this list) [15]
- Pillar four addresses unified scheduling of distributed resources to avoid "edge island" scenarios (see the second sketch after this list) [16]
- Pillar five advocates extending existing public cloud investments to edge deployments, emphasizing interoperability and data consistency [17]
- Pillar six focuses on AI-driven autonomous operations, strengthening monitoring, resource allocation, and fault-recovery capabilities [18]
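To make pillar three concrete, the minimal sketch below shows what "extracting value at the data source" can look like: raw telemetry is summarized on the edge node and only the compact summary crosses the network. The report does not prescribe an implementation; the window size, the `WindowSummary` fields, and the `upload_summary` stand-in are illustrative assumptions, not vendor APIs.

```python
"""Sketch of edge-side aggregation (pillar three). All names are illustrative."""

import json
import random
import statistics
from dataclasses import dataclass


@dataclass
class WindowSummary:
    """Compact record sent upstream instead of the raw samples."""
    count: int
    mean: float
    minimum: float
    maximum: float


def summarize_window(samples: list[float]) -> WindowSummary:
    """Reduce a window of raw sensor readings to a handful of statistics."""
    return WindowSummary(
        count=len(samples),
        mean=statistics.fmean(samples),
        minimum=min(samples),
        maximum=max(samples),
    )


def upload_summary(summary: WindowSummary) -> None:
    """Stand-in for the actual uplink; here we only serialize to JSON."""
    print(json.dumps(summary.__dict__))


if __name__ == "__main__":
    # Simulate one window of raw telemetry (e.g., 1,000 readings per minute).
    raw_window = [random.gauss(mu=42.0, sigma=3.0) for _ in range(1000)]

    # Only the four-field summary crosses the network, not the 1,000 samples,
    # which is how processing at the data source cuts transmission volume.
    upload_summary(summarize_window(raw_window))
```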
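Pillar four's unified scheduling can likewise be pictured as a single routing decision made across all edge sites plus the central cloud, rather than each site acting as an island. The toy policy below assumes latency and health signals are already collected by monitoring (pillar six); the node names and numbers are hypothetical.

```python
"""Sketch of unified scheduling across edge sites (pillar four), with a simple
fault-recovery fallback (pillar six). Names and figures are made up."""

from dataclasses import dataclass


@dataclass
class Node:
    name: str
    latency_ms: float   # measured round-trip time from the client region
    healthy: bool       # fed by whatever monitoring the operator runs


def route(edge_nodes: list[Node], central_cloud: Node) -> Node:
    """Pick the healthy edge node with the lowest latency; if every edge
    site is down, fall back to the central cloud."""
    candidates = [n for n in edge_nodes if n.healthy]
    if not candidates:
        return central_cloud
    return min(candidates, key=lambda n: n.latency_ms)


if __name__ == "__main__":
    edge_sites = [
        Node("edge-shanghai", latency_ms=8.0, healthy=True),
        Node("edge-shenzhen", latency_ms=12.0, healthy=True),
        Node("edge-chengdu", latency_ms=15.0, healthy=False),  # flagged by monitoring
    ]
    cloud = Node("central-cloud", latency_ms=65.0, healthy=True)

    target = route(edge_sites, cloud)
    print(f"dispatching request to {target.name} ({target.latency_ms} ms)")
```

A production scheduler would weigh capacity, data residency, and cost alongside latency, but the shape of the decision, one control plane over many sites with a cloud fallback, is the point the pillar makes.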