Workflow
Engineering Challenges Across the Full Pipeline of Inference, Training, and Data: Who Is Building the Foundational Capabilities of China's AI? | AICon Beijing
AI前线·2025-06-16 07:37

Core Viewpoint
- The rapid evolution of large models has shifted attention from the models themselves to systemic issues such as slow inference, unstable training, and data migration challenges, all of which are critical for deploying the technology at scale [1]

Group 1: Key Issues in Domestic AI
- Domestic AI faces challenges including computing power adaptation, system fault tolerance, and data compliance, which are essential for its practical application [1]
- The AICon conference will address seven key topics focused on the infrastructure of domestic AI, including native adaptation of domestic chips for inference and the cloud-native evolution of AI data foundations [1]

Group 2: Presentations Overview
- Qingcheng Jizhi's "Chitu Inference Engine" talk covers efficiently deploying FP8-precision models on domestic chips, removing the reliance on NVIDIA's Hopper architecture [4]
- Huawei's session on deploying DeepSeek will discuss performance optimization strategies for running large models on domestic computing platforms [5][6]
- JD Retail's presentation will cover the technical challenges and optimization practices for achieving high throughput and low latency with large language models in retail applications [7]
- Alibaba's session will explore the design and future development of reinforcement learning systems, emphasizing the combined complexity of the algorithms and the system requirements [8]
- The session on the SGLang inference engine will present an efficient open-source deployment solution that integrates advanced techniques to reduce inference costs (a minimal deployment sketch follows this list) [9]
- Ant Group will share stability practices in large model training, focusing on distributed training fault tolerance and performance analysis tools [10]
- Zilliz will discuss the evolution of data infrastructure for AI, including vector data migration tools and cloud-native data platforms (a minimal migration sketch also follows this list) [11]
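
To make the SGLang item concrete, here is a minimal sketch of the kind of low-cost deployment the session describes: SGLang's documented launcher serves a model behind an OpenAI-compatible HTTP endpoint, which a standard OpenAI client can then query. The model name, port, and prompt below are illustrative assumptions, not details from the talk.

```python
# Minimal sketch: serving a model with SGLang's OpenAI-compatible server.
# The model path, port, and prompt are illustrative assumptions.
#
# Start the server first (SGLang's documented launcher):
#   python -m sglang.launch_server --model-path Qwen/Qwen2.5-7B-Instruct --port 30000
#
# Then query it with the standard OpenAI client pointed at the local endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:30000/v1",  # SGLang's OpenAI-compatible endpoint
    api_key="EMPTY",                        # no key needed for a local server
)

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",       # assumed to match the served model path
    messages=[{"role": "user", "content": "Summarize FP8 inference in one sentence."}],
    max_tokens=128,
)
print(response.choices[0].message.content)
```

Because the endpoint is OpenAI-compatible, existing application code can be pointed at the SGLang server by changing only the base URL, which is a large part of why such open-source engines lower serving costs.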
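The Zilliz item mentions vector data migration tools; the sketch below shows the basic idea with a hand-rolled pymilvus copy between two Milvus deployments. The URIs, collection name, dimension, and row limit are assumptions for illustration only; real migrations would use dedicated tooling and stream data in batches.

```python
# Minimal sketch of copying one small collection between two Milvus deployments
# with pymilvus. URIs, collection name, and dimension are illustrative assumptions.
from pymilvus import MilvusClient

SRC_URI = "http://source-milvus:19530"   # hypothetical source instance
DST_URI = "http://target-milvus:19530"   # hypothetical target instance
COLLECTION = "docs_embeddings"           # hypothetical collection name
DIM = 768                                # hypothetical vector dimension

src = MilvusClient(uri=SRC_URI)
dst = MilvusClient(uri=DST_URI)

# Create the target collection if it does not exist (quick-setup schema:
# an "id" primary key plus a "vector" field of the given dimension).
if not dst.has_collection(COLLECTION):
    dst.create_collection(collection_name=COLLECTION, dimension=DIM)

# Make sure the source collection is loaded so it can be queried.
src.load_collection(COLLECTION)

# Read entities from the source; "id >= 0" matches every row under the
# default integer primary key. A real migration would paginate/batch this.
rows = src.query(
    collection_name=COLLECTION,
    filter="id >= 0",
    output_fields=["*"],   # return scalar fields and the vector field
    limit=10000,
)

# Insert the rows into the target collection as-is.
if rows:
    dst.insert(collection_name=COLLECTION, data=rows)
print(f"migrated {len(rows)} rows")
```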