Core Insights
- A Tsinghua University research team has proposed a "density law" for large language models, indicating that the maximum capability density of these models grows exponentially over time, doubling approximately every 3.5 months between February 2023 and April 2025 [1][2]

Group 1: Density Law and Its Implications
- The density law shifts the focus from the sheer size (parameter count) of large models to their "capability density," a measure of the intelligence delivered per unit of parameters [2]
- Analyzing 51 open-source large models, the team found that maximum capability density has been rising exponentially, with a notable acceleration after the release of ChatGPT: the doubling time shortened from roughly 4.8 months to 3.2 months [2] (see the sketch after this summary)

Group 2: Cost and Efficiency
- Higher capability density means large models become smarter while requiring less computation and lower costs [3]
- Continued gains in capability density, combined with rising chip circuit density, mean that large models once confined to cloud deployment can now run on terminal (on-device) chips, improving responsiveness and user privacy [3]

Group 3: Application in Industry
- Under the density law, AI is becoming increasingly accessible, enabling more proactive services in smart vehicles and a shift from passive responses to active decision-making [3]
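To make the reported growth rate concrete, the minimal sketch below models capability density as doubling every 3.5 months. The normalized starting density of 1.0, the function names `projected_density` and `months_to_reach`, and the fixed-doubling-period form are illustrative assumptions for the arithmetic only; they are not the paper's actual fitting procedure.

```python
from math import log2

DOUBLING_PERIOD_MONTHS = 3.5  # reported doubling time for maximum capability density


def projected_density(initial_density: float, months_elapsed: float) -> float:
    """Capability density after `months_elapsed` months, assuming exponential
    growth with a fixed doubling period (illustrative model, not the paper's fit)."""
    return initial_density * 2 ** (months_elapsed / DOUBLING_PERIOD_MONTHS)


def months_to_reach(initial_density: float, target_density: float) -> float:
    """Months needed for density to grow from `initial_density` to `target_density`."""
    return DOUBLING_PERIOD_MONTHS * log2(target_density / initial_density)


if __name__ == "__main__":
    # With density normalized to 1.0 at the start of the window, one year of
    # growth at the reported rate multiplies it by 2^(12/3.5), roughly 10.8x.
    print(f"Density after 12 months: {projected_density(1.0, 12):.1f}x")
    print(f"Months to grow 100x: {months_to_reach(1.0, 100):.1f}")
```

Under these assumptions, a year of progress corresponds to roughly a tenfold increase in density, which is the arithmetic behind the claim that models of a given capability keep shrinking in parameter count and cost.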
Large models no longer compete on "bulk": the maximum capability density of large language models grows exponentially over time
Ke Ji Ri Bao·2025-11-25 00:13