Core Insights
- The cost of inference for large models is falling rapidly, at roughly tenfold per year, a trend crucial to the coming explosion of AI-First applications [2]
- 2025 is anticipated to be a pivotal year for the explosion of AI-First applications and the practical deployment of large models [2]
- Growth under the pre-training scaling law is slowing due to limits on data volume and compute, but a new "slow thinking" scaling law is emerging in which longer inference time yields better results [2][3]

Industry Developments
- Model performance under the slow-thinking scaling law is improving at an accelerating rate, indicating significant room for further advances [3]
- The Chinese market is experiencing a "DeepSeek Moment" that is awakening AI-First applications and overcoming earlier barriers [3]
- The company has made strategic adjustments to fully embrace DeepSeek, focusing on turning high-quality base models into enterprise-grade customized solutions, akin to building a Windows for the AI 2.0 era [3]
Kai-Fu Lee: 01.AI (零一万物) is building the Windows of the AI 2.0 era on top of DeepSeek