中信建投 | Alibaba AI Models: Rich Product Matrix, Open-Source Ecosystem Secures B-End Share
Sina Finance · 2025-12-04 11:28
Core Insights
- Alibaba is leveraging the Qwen large-model foundation to comprehensively reshape its business, using an open-source strategy and strong model performance to accelerate the construction of B-end ecosystem barriers [2][42]
- The company is committed to increasing capital expenditure to meet strong demand for computing power; cloud revenue continues to grow significantly, validating the closed-loop logic of "infrastructure investment - technology iteration - commercial monetization" [2][42]

AI Model Development
- Alibaba moved early on its AI model layout: the flagship Qwen series has iterated through three major versions and multiple minor versions in just over two years, covering vertical scenarios such as text, mathematics, code, and multi-modal applications [3][43]
- As the only major player adhering to an open-source strategy, Alibaba has accelerated model iteration since 2024, narrowing the capability gap with overseas models, and is expected to surpass closed-source models in the B-end market [3][43]

Model Capabilities and Updates
- Alibaba's AI model lineup has achieved "full size," "full modality," and "multi-scenario" coverage; its first trillion-parameter multi-modal model, M6, was released in March 2021, with subsequent expansions in parameter scale [4][43]
- As of October 2025, the Qwen series has iterated through three major versions and multiple minor versions, covering model sizes from 0.5 billion to one trillion parameters [4][43]
- The Qwen series has open-sourced a total of 357 models, with update frequency accelerating markedly since 2024: 71 models were released in the first half of 2024 and 120 in the second half [5][45]

Competitive Positioning
- Alibaba's model capabilities rank in the global first tier, with the gap to leading overseas firms reduced from over six months to approximately three months [12][51]
- The Qwen3 235B model released on July 22, 2025 is comparable to DeepSeek-V3.1 Terminus, while Qwen2.5-72B-Instruct was the first domestic model to surpass an overseas model in the open-source arena [12][51]

Market Strategy and Ecosystem
- The company has formed a complete layout across both B-end and C-end markets, with open source serving as the foundation for business and product support [35][74]
- As of the 2025 Yunqi Conference, the Tongyi series models had reached 600 million downloads and served over 1 million customers, with the ModelScope (Mota) community further strengthening the open-source ecosystem [35][74]

Future Outlook
- The near-term focus is the official release of Qwen3-Next (equivalent to Qwen3.5) and the optimization of vertical models built on it, with Qwen4 expected in Q2 2026 [21][60]
- The ongoing open-source strategy is expected to sustain the current frequency of model updates, further enhancing capabilities while narrowing the time gap with leading overseas models [21][60]