Huawei's Big AI Move!
Securities Times · 2025-08-10 07:00
Core Viewpoint
- Huawei is set to release groundbreaking AI inference technology that may reduce China's reliance on HBM (High Bandwidth Memory), enhancing the performance of domestic AI large-model inference and improving China's AI inference ecosystem [1]

Group 1: AI Inference Technology
- Huawei will jointly launch its latest AI inference application results with China UnionPay on August 12, introducing a significant inference acceleration technology [1]
- HBM is crucial for addressing "data transportation" bottlenecks; insufficient HBM leads to a poor user experience in AI inference, with task delays and slow responses [2]

Group 2: Forum and Expert Contributions
- Experts from the China Academy of Information and Communications Technology, Tsinghua University, and iFlytek will share practices on large-model inference acceleration and experience optimization at the "2025 Financial AI Inference Application Implementation and Development Forum" on August 12 [3]

Group 3: Event Schedule
- Opening remarks [5]
- Introduction and release ceremony of UnionPay's inference application results [5]
- Presentation of Huawei's AI storage inference acceleration solution [5]
- Discussion on large-model inference optimization and new paradigms for industrial implementation [5]
- Presentation on a KV Cache storage-centered large-model inference architecture by Tsinghua University [5]
- iFlytek's high-performance inference practices on its MaaS platform [5]
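To illustrate why HBM capacity and the KV Cache mentioned above are central to large-model inference, here is a minimal back-of-the-envelope sketch (not from the article; the model configuration is a hypothetical example) of how the KV cache of a transformer decoder grows with context length and batch size, quickly exhausting accelerator memory:

```python
def kv_cache_bytes(num_layers: int, num_kv_heads: int, head_dim: int,
                   seq_len: int, batch: int, bytes_per_elem: int = 2) -> int:
    """Estimate KV cache size for a transformer decoder.

    Each layer stores one K and one V vector per KV head per token,
    hence the leading factor of 2. bytes_per_elem=2 assumes fp16/bf16.
    """
    return 2 * num_layers * num_kv_heads * head_dim * seq_len * batch * bytes_per_elem


# Hypothetical 70B-class configuration (assumed values, for illustration only):
size = kv_cache_bytes(num_layers=80, num_kv_heads=8, head_dim=128,
                      seq_len=4096, batch=32)
print(f"KV cache: {size / 2**30:.1f} GiB")  # grows linearly with seq_len and batch
```

Even with grouped-query attention (8 KV heads here), the cache reaches tens of GiB at moderate batch sizes, which is why moving KV Cache management into a storage tier, as in the Tsinghua University presentation's theme, can relieve pressure on scarce HBM.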