Core Viewpoint
- Huawei has officially announced the open-source release of the Pangu 7B dense model and the Pangu Pro MoE 72B mixture-of-experts model, along with Ascend-based inference technology, marking a significant step in promoting AI applications across various industries [2].

Group 1
- The Pangu Pro MoE 72B model weights and basic inference code have been officially released on the open-source platform [3].
- The Ascend-based inference code for large-scale MoE models has also been officially released on the open-source platform [4].
- The weights and inference code for the Pangu 7B models will be available on the open-source platform soon; global developers, partners, and researchers are invited to download them and provide feedback [5].

Group 2
- Huawei's Ascend architecture has achieved a 20% acceleration in MoE training and a 70% reduction in memory usage, showcasing the capabilities of the Pangu architecture [7].
Huawei Announces Open-Source Release of Pangu 7B Dense and 72B Mixture-of-Experts Models
Leiphone (雷峰网) · 2025-06-30 04:32