Kimi's Yang Zhilin says "training costs are hard to quantify," will stick with open-source strategy
Di Yi Cai Jing · 2025-11-11 12:04
Core Viewpoint
- Kimi, an AI startup, is focusing on open-source model development; its recently released Kimi K2 Thinking has a reported training cost of $4.6 million, significantly lower than that of competitors such as DeepSeek V3 and OpenAI's GPT-3 [3][4][6]

Summary by Sections

Model Development and Costs
- Kimi has invested heavily in open-source model research and updates over the past six months, releasing Kimi K2 Thinking on November 6 with a reported training cost of $4.6 million, lower than DeepSeek V3's $5.6 million and OpenAI GPT-3's billions [3][4]
- CEO Yang Zhilin clarified that the $4.6 million figure is not official; because most spending goes to research and experimentation, training costs are difficult to quantify [4][6]

Model Performance and Challenges
- Users raised concerns about Kimi K2 Thinking's reasoning length and about discrepancies between leaderboard scores and real-world performance. Yang said the model currently prioritizes absolute performance, with plans to improve token efficiency in the future [4][7]
- The gap between leaderboard performance and real-world experience is expected to narrow as the model's general capabilities improve [7]

Market Position and Strategy
- Chinese open-source models are increasingly used in the international market, with five Chinese models appearing in the top twenty of the OpenRouter model usage rankings [7]
- Because of interface issues with the OpenRouter platform, Kimi can currently be accessed only via API [7]
- Kimi plans to maintain its open-source strategy, focusing on the application and optimization of Kimi K2 Thinking while balancing text and multimodal model development and avoiding direct competition with leading firms such as OpenAI [6][8]
Kimi's Yang Zhilin says "training costs are hard to quantify," will stick with open-source strategy
Di Yi Cai Jing · 2025-11-11 10:35
Core Insights
- Kimi, an AI startup, has released its latest open-source model, Kimi K2 Thinking, with a reported training cost of $4.6 million, significantly lower than DeepSeek V3's $5.6 million and the billions OpenAI is said to have spent training GPT-3 [1][2]
- The company emphasizes ongoing model updates and improvements, prioritizing absolute performance while addressing user concerns about inference length and performance discrepancies [1]
- Kimi's strategy is to maintain its open-source approach and keep advancing Kimi K2 Thinking while avoiding direct competition with major players like OpenAI through innovative architecture and cost control [2][4]

Model Performance and Market Position
- In the latest OpenRouter model usage rankings, five Chinese open-source models, including Kimi's, rank among the top twenty, indicating a growing presence in the international market [2]
- Kimi's current model can only be accessed via API due to platform limitations; the team trains on H800 GPUs with InfiniBand, despite having fewer resources than labs with access to higher-end U.S. GPUs [2] (a minimal API-call sketch, under stated assumptions, follows at the end of this digest)
- The company plans to balance text model development with multimodal model advancements, aiming to establish a differentiated advantage in the AI landscape [4]
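Since both summaries note that Kimi K2 Thinking is currently reachable only through an API, the sketch below shows one plausible way to call it via an OpenAI-compatible endpoint such as OpenRouter's. The base URL, the model slug "moonshotai/kimi-k2-thinking", and the environment-variable name are illustrative assumptions not confirmed by the article; the actual identifiers should be taken from the provider's documentation.

```python
# Minimal sketch: querying Kimi K2 Thinking through an OpenAI-compatible API.
# Assumptions (not from the article): the OpenRouter-style base URL, the model
# slug "moonshotai/kimi-k2-thinking", and the OPENROUTER_API_KEY variable name.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",       # assumed endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],      # hypothetical env var
)

response = client.chat.completions.create(
    model="moonshotai/kimi-k2-thinking",           # assumed model slug
    messages=[
        {
            "role": "user",
            "content": "Briefly explain the trade-off between reasoning "
                       "length and token efficiency in a thinking model.",
        }
    ],
)

# Print only the assistant's final text; reasoning/"thinking" traces, if any,
# are provider-specific and not handled here.
print(response.choices[0].message.content)
```

Using the standard OpenAI client with an overridden base URL is a common pattern for OpenAI-compatible gateways; if Kimi's own API is used directly instead, only the base URL, key, and model slug would need to change under the same assumptions.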