Baidu Releases Wenxin X1.1 Large Model and Open-Sources New Model; Wang Haifeng: PaddlePaddle-Wenxin Ecosystem Reaches 23.33 Million Developers
AI前线·2025-09-11 05:33

Core Insights
- Baidu has officially launched the Wenxin large model X1.1, which shows significant improvements in factuality, instruction following, and agent capabilities over its predecessor [4][10]
- Wenxin X1.1 improves factuality by 34.8%, instruction following by 12.5%, and agent performance by 9.6% [4]
- The PaddlePaddle framework v3.2 has been released, improving training efficiency and compatibility with a range of chips and reaching an operator kernel reuse rate of up to 92% [7][8]

Model Launch and Performance
- Wenxin X1.1 is built on the Wenxin 4.5 model and trained with an iterative mixed reinforcement learning framework [4]
- In benchmark evaluations, Wenxin X1.1 outperformed DeepSeek R1-0528 and matched top international models such as GPT-5 and Gemini 2.5 Pro [4]

Framework and Deployment Enhancements
- Core upgrades in PaddlePaddle v3.2 significantly improve training efficiency, reaching a pre-training MFU of 47% on the ERNIE-4.5-300B-A47B model (a worked MFU estimate is sketched after this summary) [7]
- The FastDeploy suite improves end-to-end inference performance, reaching input throughput of 57K tokens per second and output throughput of 29K tokens per second under the stated latency constraints [8]

Open Source Initiatives
- Baidu has open-sourced the ERNIE-4.5-21B-A3B-Thinking model, which performs strongly on tasks such as content creation and logical reasoning (a minimal loading sketch also follows after this summary) [8]
- The company has also released GraphNet, a large-scale computation graph dataset containing over 2,700 model computation graphs [8]

Developer Ecosystem Growth
- The PaddlePaddle ecosystem has reached 23.33 million developers and serves 760,000 enterprises [10]
- The Wenxin code assistant, Wenxin Kuaima, has been upgraded to version 3.5S, strengthening multi-agent collaboration and serving more than 10 million developers [11]
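
The 47% pre-training MFU figure can be read as the ratio of the FLOPs the training run actually performs to the hardware's theoretical peak. The sketch below shows that arithmetic under the common "6 × active parameters × tokens" approximation for training FLOPs; the token throughput, accelerator count, and peak-TFLOPs values are hypothetical placeholders, not numbers from Baidu's announcement.

```python
# Back-of-the-envelope MFU (Model FLOPs Utilization) estimate.
# Assumption: training FLOPs per second ~= 6 * N_active * tokens_per_second,
# the standard approximation for transformer training; for an MoE model
# such as ERNIE-4.5-300B-A47B only the ~47B activated parameters count.

def estimate_mfu(tokens_per_second: float,
                 active_params: float,
                 num_accelerators: int,
                 peak_tflops_per_accelerator: float) -> float:
    """Return MFU as a fraction in [0, 1]."""
    achieved_flops = 6.0 * active_params * tokens_per_second
    peak_flops = num_accelerators * peak_tflops_per_accelerator * 1e12
    return achieved_flops / peak_flops


if __name__ == "__main__":
    # All inputs below are illustrative, not from the release.
    mfu = estimate_mfu(
        tokens_per_second=1.5e6,          # hypothetical cluster-wide token throughput
        active_params=47e9,               # ~47B activated parameters (the "A47B")
        num_accelerators=2048,            # hypothetical cluster size
        peak_tflops_per_accelerator=400,  # hypothetical peak BF16 TFLOPs per chip
    )
    print(f"Estimated MFU: {mfu:.1%}")
```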
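
Because ERNIE-4.5-21B-A3B-Thinking is open-sourced, one straightforward way to try it is through Hugging Face Transformers. This is only a minimal sketch: the `baidu/ERNIE-4.5-21B-A3B-Thinking` repository id, the need for `trust_remote_code`, and the chat-template usage are assumptions to verify against the official model card before running.

```python
# Minimal sketch: loading the open-sourced reasoning model with
# Hugging Face Transformers. Repo id and loading flags are assumed,
# not taken from the article; check the official model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baidu/ERNIE-4.5-21B-A3B-Thinking"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

messages = [{"role": "user", "content": "Explain why the sky is blue in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```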