Core Viewpoint
- Kimi, an AI startup, is focusing on open-source model development, most recently releasing Kimi K2 Thinking, whose reported training cost of $4.6 million is significantly lower than that of competitors such as DeepSeek V3 and OpenAI's GPT-3 [3][4][6]

Summary by Sections

Model Development and Costs
- Kimi has invested heavily in open-source model research and updates over the past six months, releasing Kimi K2 Thinking on November 6 with a reported training cost of $4.6 million, lower than DeepSeek V3's $5.6 million and the billions attributed to OpenAI's GPT-3 [3][4]
- CEO Yang Zhilin clarified that the $4.6 million figure is not official: most spending goes to research and experimentation, which makes the training cost difficult to quantify [4][6]

Model Performance and Challenges
- Users raised concerns about Kimi K2 Thinking's reasoning length and about discrepancies between leaderboard scores and real-world performance; Yang said the model currently prioritizes absolute performance, with token efficiency to be improved later [4][7]
- The gap between leaderboard results and real-world experience is expected to narrow as the model's general capabilities improve [7]

Market Position and Strategy
- Chinese open-source models are increasingly used in international markets, with five Chinese models appearing in the top twenty of the OpenRouter usage rankings [7]
- Due to interface issues with the OpenRouter platform, Kimi K2 Thinking can currently be accessed only via API (see the sketch after this list) [7]
- Kimi plans to maintain its open-source strategy, focusing on applying and optimizing Kimi K2 Thinking while balancing text and multimodal model development, and avoiding head-on competition with leading firms such as OpenAI [6][8]
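Since the summary notes that Kimi K2 Thinking is currently reachable only through an API, the following is a minimal sketch of what such a call might look like. It assumes an OpenAI-compatible gateway (OpenRouter exposes one at https://openrouter.ai/api/v1) and an illustrative model slug, `moonshotai/kimi-k2-thinking`, which should be checked against the provider's current model list; neither detail is confirmed by the article.

```python
# Minimal sketch, not the official integration: assumes an OpenAI-compatible
# endpoint and an illustrative model slug for Kimi K2 Thinking.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenAI-compatible gateway (assumed)
    api_key="YOUR_OPENROUTER_API_KEY",        # placeholder credential
)

response = client.chat.completions.create(
    model="moonshotai/kimi-k2-thinking",  # assumed slug; verify in the model list
    messages=[
        {"role": "user", "content": "Summarize the trade-offs of long reasoning chains."}
    ],
)
print(response.choices[0].message.content)
```

Because the gateway mirrors the OpenAI chat-completions interface, switching between hosted models is largely a matter of changing the model string and the API key.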
Kimi's Yang Zhilin: "Training Costs Are Hard to Quantify"; Open-Source Strategy Will Continue
第一财经 (Yicai) · 2025-11-11 12:04