Huawei Announces Open-Source Release of Pangu 7B Dense and 72B Mixture-of-Experts Models
财联社·2025-06-30 06:21

Core Viewpoint - Huawei officially announced the open-source release of the Pangu 7-billion-parameter (7B) dense model and the Pangu Pro MoE 72-billion-parameter (72B) mixture-of-experts model, along with model inference technology based on Ascend, marking a significant step in promoting research and innovation in large model technology and accelerating the application and value creation of artificial intelligence across industries [1].

Group 1
- The Pangu Pro MoE 72B model weights and basic inference code have been officially released on the open-source platform [2].
- The inference code for the ultra-large-scale MoE model based on Ascend has been officially released on the open-source platform [3].
- The weights and inference code for the Pangu 7B model will be available on the open-source platform soon; Huawei invites global developers, enterprise partners, and researchers to download them and provide feedback for improvement [4].