Huawei's First! Major Release!
Securities Times · 2025-06-30 04:12

Core Viewpoint
- Huawei's announcement that it will open source the Pangu 7 billion parameter dense model and the 72 billion parameter mixture-of-experts model (Pangu Pro MoE 72B) is a significant step in promoting the development and application of large model technology across industries, in line with its Ascend ecosystem strategy [1][7].

Group 1: Model Specifications and Performance
- The newly open-sourced Pangu Pro MoE 72B model, with 72 billion total parameters and 16 billion active parameters, delivers performance that can rival models with over a trillion parameters, according to the latest SuperCLUE rankings [3][4].
- Huawei's Pangu Ultra MoE model, launched on May 30, has a parameter scale of 718 billion, showcasing advances in training performance on the Ascend AI computing platform [4][5].

Group 2: Strategic Implications
- The release of these models demonstrates Huawei's capability to build world-class large models on its Ascend architecture, achieving a fully controllable training process from hardware to software [5].
- Huawei's distinctive large model strategy emphasizes practical application across industries, aiming to solve real-world problems and accelerate the intelligent upgrade of numerous sectors [5][7].

Group 3: Industry Impact
- The Pangu large models have been deployed in more than 30 industries and 500 scenarios, delivering significant value in sectors such as government, finance, manufacturing, healthcare, and autonomous driving [5].
- The open-sourcing initiative is expected to attract more developers and vertical industries to build intelligent solutions on the Pangu models, further advancing the integration of AI across fields [7].