Core Insights
- Tesla and xAI founder Elon Musk has shown support for Chinese domestic large models, attracting attention across the industry [1][3]
- The global programming tool Cursor released its self-developed coding model Composer 2, which surpassed Claude Opus 4.6 in evaluations and emphasizes cost-effectiveness [1]
- Composer 2 is based on the Kimi K2.5 model, a fact Musk acknowledged on social media [1]

Group 1
- The Kimi team expressed gratitude with the Chinese phrase "Thank you for being you," showing a blend of technical confidence and warmth [3]
- On March 16, Kimi released a technical report titled "Attention Residuals," which restructures the residual connection mechanism of large models; it achieved a 1.25x improvement in training efficiency on a 48-billion-parameter model, with scientific reasoning and mathematical performance rising by 7.5% and 3.6% respectively [3]
- Musk praised Kimi's work on social media, calling the achievement impressive [3]

Group 2
- On March 2, Alibaba's Qwen officially open-sourced four small-sized models: Qwen3.5-0.8B, 2B, 4B, and 9B; Musk commented on their impressive intelligence density [3]
- ByteDance's new video generation model Seedance 2.0 began internal testing on February 12, addressing industry pain points such as low usability and character-detail drift; it can generate 60 seconds of 2K broadcast-quality video [3]
- Musk expressed amazement at the rapid pace of AI progress, responding to Seedance 2.0 with "It's happening fast" [3]
"This is Kimi!" Musk shoots to the top of trending searches after twice praising Chinese AI company Moonshot AI (月之暗面)
Securities Times (证券时报) · 2026-03-21 08:57