Scaling Law
Expert: robots may outnumber humans by 2035
The rapid development of the AI industry is accelerating the pace of iteration across many sectors. Speaking at the 2025 Snapdragon Summit China, Zhang Yaqin, foreign member of the Chinese Academy of Engineering and dean of the Institute for AI Industry Research (AIR) at Tsinghua University, said that the new generation of artificial intelligence is a fusion of atoms, molecules, and bits, and of information intelligence, physical intelligence, and biological intelligence, which will bring enormous industrial opportunities. In terms of industry scale, the mobile internet era was at least 10 times larger than the PC internet era; in the AI era, the overall industry will be at least 100 times larger than the previous generation.

The AI industry has also shown five new trends over the past year. The first, Zhang Yaqin noted, is the move from discriminative AI to generative AI, and now toward agentic AI. A key marker is that over the past seven months the task length AI agents can handle has doubled while accuracy has exceeded 50%, which will accelerate the adoption of agents in every domain. The second trend is that the scaling law in the pre-training stage has slowed over the past year, with more of the work shifting to post-training stages such as reasoning and agent applications. At the same time, embodied intelligence is set to take off rapidly: by 2035, a decade from now, robots are expected to outnumber humans. This also gives rise to the fourth trend, a rapid rise in AI risk. "The arrival of agents has at least doubled AI risk," Zhang Yaqin added, noting that this in particular demands more attention from companies and governments worldwide, and that he himself spends a great deal of time on it. The fifth trend, ...
Demystifying XPeng's autonomous-driving "foundation model" and "VLA large model"
自动驾驶之心· 2025-09-17 23:33
Author | Pirate Jack    Source | Vehicle

At the 2025 CVPR autonomous driving workshop, Liu Xianming of XPeng Motors gave a talk titled "Scaling up Autonomous Driving via Large Foundation Models". Plenty of coverage of XPeng's VLA talk at this CVPR has circulated online, but those were promotional posts showing only what others wanted you to see. This article is based on Liu Xianming ...
What is driving the current AI computing power rally
淡水泉投资· 2025-09-17 10:06
Core Viewpoint
- The AI market has moved through distinct phases, with demand now shifting from training to inference and driving a new wave of growth in AI-related capital expenditure [1][2].

Group 1: Scaling Law and Demand
- The "scaling law" holds that greater investment in GPUs and computational power improves AI performance; the focus has moved from pre-training to post-training and is now centered on inference [2][4].
- In 2023 the scaling law was most evident in the pre-training phase; in 2024 it shifted toward post-training, optimizing models for specific tasks [2].
- Demand for inference has surged across programming, search, and image processing, with monthly token consumption for Google's Gemini rising 50-fold in a single year [4][7].

Group 2: Capital Investment Trends
- The AI industry is drawing annual capital investment in the hundreds of billions of dollars, benefiting upstream sectors including GPUs, high-speed interconnects, power supply, and cooling systems [7][8].
- Investment in computing power splits into overseas and domestic segments, each with its own investment logic [7].

Group 3: Overseas Computing Power
- Product upgrades in overseas computing power center on higher-performance products that raise value in specific segments, driven by chip and interconnect upgrades [8][10].
- Price-sensitive upstream segments are exposed to swings in downstream demand, producing supply bottlenecks and price increases, as exemplified by the PCB industry [9].

Group 4: Domestic Computing Power
- The computing-power gap between U.S. and Chinese internet companies is widening: U.S. companies are doubling their compute reserves annually, while domestic growth, though rapid, lags behind because of export restrictions on high-end chips [13][15].
- Domestic GPUs are improving, with some models now matching the performance of NVIDIA's lower-tier offerings, indicating potential for competitiveness [15].
- The shift in AI demand from training to inference favors domestic computing power, allowing it to meet specific customer needs in certain scenarios [15][16].

Group 5: Market Dynamics and Future Outlook
- The AI industry carries high uncertainty and rapidly changing trends, calling for a cautious yet proactive approach to investment in AI computing power [16].
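For reference, the "scaling law" invoked throughout these notes is usually written as an empirical power law relating model loss to parameter count and training data. The form and constants below follow the commonly cited Chinchilla-style fit from the literature; they are illustrative reference values, not figures taken from this article.

```latex
% Empirical scaling-law form (illustrative; constants are approximate literature values, not from this article)
L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
% N: number of parameters, D: number of training tokens, E: irreducible loss
% Reported fits put roughly \alpha \approx 0.34 and \beta \approx 0.28; exact values depend on the setup.
```

Read this way, once parameter scaling saturates, further gains must come from more data or from compute spent elsewhere, which is one way to interpret the reported shift from pre-training toward post-training and inference.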
Zhang Hongjiang at the Bund Summit: infrastructure expansion is accelerating as AI enters "industrial scaling"
Bei Ke Cai Jing· 2025-09-11 07:09
Core Insights
- The "Scaling Law" for large models remains valid: higher parameter counts still yield better performance, although the industry perceives a gradual slowdown in pre-trained model scaling [3]
- The emergence of reasoning models has created a new curve for large-scale development, termed "reasoning scaling," which shifts computational demand toward context and memory [3]
- The cost of using large language models (LLMs) is falling rapidly, with the price per token dropping sharply over the past three years, further reinforcing the scaling law [3]
- AI is driving massive infrastructure expansion, with AI-related capital expenditure by major U.S. tech companies projected to exceed $300 billion in 2025 [3]
- The AI data center industry has seen a construction boom that is expected to stimulate the power ecosystem and economic growth, the core of "AI industrial scaling" [3]

Industry Transformation
- Humanity is entering the "agent swarm" era, characterized by numerous intelligent agents interacting, executing tasks, and exchanging information, giving rise to the "agent economy" [4]
- Future organizations will treat models and GPU computing power as core assets, requiring expanded compute to strengthen models and enrich data [4]
- The combination of "super individuals" and agents is expected to bring significant structural changes to enterprise processes [4]
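As a concrete illustration of why "reasoning scaling" stresses context and memory, the sketch below estimates the KV cache a transformer must keep in GPU memory as context length grows. This is a back-of-envelope illustration of my own, not material from the talk, and the model dimensions are hypothetical placeholders.

```python
# Back-of-envelope KV-cache estimate for long-context inference.
# All model dimensions are hypothetical placeholders, not figures from the talk.

def kv_cache_bytes(num_layers: int, num_kv_heads: int, head_dim: int,
                   context_len: int, batch_size: int, bytes_per_elem: int = 2) -> int:
    """Bytes needed to cache keys and values for every token in the context.

    Per token and per layer we store one key and one value vector per KV head,
    each of length head_dim, at bytes_per_elem bytes (2 for fp16/bf16).
    """
    per_token = 2 * num_layers * num_kv_heads * head_dim * bytes_per_elem  # K and V
    return per_token * context_len * batch_size


if __name__ == "__main__":
    # A hypothetical 70B-class dense model using grouped-query attention.
    cfg = dict(num_layers=80, num_kv_heads=8, head_dim=128)
    for ctx in (4_096, 32_768, 131_072):
        gb = kv_cache_bytes(**cfg, context_len=ctx, batch_size=1) / 1e9
        print(f"context {ctx:>7,} tokens -> ~{gb:5.1f} GB of KV cache per sequence")
```

With these placeholder dimensions the cache grows from roughly 1 GB at 4K tokens to tens of gigabytes at 128K tokens for a single sequence, which is why long-context reasoning workloads tend to be bound by memory capacity and bandwidth rather than raw FLOPs.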
Source Code Capital's Zhang Hongjiang: AI enters "industrial scaling"
Hua Er Jie Jian Wen· 2025-09-11 06:07
Group 1
- The core viewpoint is that AI is advancing rapidly despite ongoing disagreements, with significant implications for the economy and society from the emergence of large language models and intelligent agents [1][2]
- Scaling Law remains a fundamental principle for improving the performance of large models, and the introduction of an "inference scaling law" marks a new curve for large-scale development [1]
- The cost of using large models is falling, as shown by the rapid decline in the price per token over the past three years, which will further reinforce the scaling law [1][2]

Group 2
- AI is driving large-scale infrastructure expansion, with AI capital expenditure by major U.S. tech companies projected to exceed $300 billion in 2025 [2]
- Large-scale construction in the AI data center industry is expected to stimulate the power ecosystem and economic growth, the core of "AI industry scaling" [2]
- The arrival of the "agent swarm" era points to a future in which numerous intelligent agents interact and exchange tasks and information, leading to an "agent economy" [2]
Zhang Hongjiang: infrastructure expansion accelerates as AI enters "industrial scaling"
Yang Guang Wang· 2025-09-11 05:07
Group 1
- The core principle of "Scaling Law" for large models remains valid, indicating that higher parameter counts lead to better performance [2]
- The emergence of reasoning models has created a new curve for large-scale development, termed "reasoning scaling" [2]
- The rapid decline in the cost per token for large language models (LLM) over the past three years will further reinforce the scaling law [2]

Group 2
- AI is driving large-scale expansion of infrastructure, with the AI data center industry experiencing significant construction activity over the past year [2]
- The large-scale construction in the IDC industry will stimulate the power ecosystem and economic development, reflecting the core of "AI industrial scaling" [2]

Group 3
- Humanity is entering the "agent swarm" era, characterized by numerous agents interacting, executing tasks, and exchanging information [3]
- The interaction between humans and agent swarms will form the basis of the "agent economy" [3]
- Models and GPU computing power will become core assets for future organizations, necessitating the expansion of computing power to enhance models and enrich data [3]
GPT-5 "disappoints": has AI hit a wall?
Hua Er Jie Jian Wen· 2025-08-17 03:00
Core Insights
- OpenAI's GPT-5 release did not meet expectations, leading to disappointment among users and raising questions about the future of AI development [1][3]
- The focus of the AI race is shifting from achieving AGI to practical applications and cost-effective productization [2][7]

Group 1: Performance and Expectations
- GPT-5's performance was criticized for being subpar, with users reporting basic errors and a lack of significant improvements over previous models [1][3]
- The release has sparked discussions about whether the advancements in generative AI have reached their limits, challenging OpenAI's high valuation of $500 billion [1][5]

Group 2: Market Sentiment and Investment
- Despite concerns about technological stagnation, investor enthusiasm for AI applications remains strong, with AI accounting for 33% of global venture capital this year [6][8]
- Companies are increasingly focusing on integrating AI models into products, with OpenAI deploying engineers to assist clients, indicating a shift towards practical applications [7][8]

Group 3: Challenges and Limitations
- The "scaling laws" that have driven the development of large language models are approaching their limits due to data exhaustion and the physical and economic constraints of computational power [5][6]
- Historical parallels are drawn to past "AI winters," with warnings that inflated expectations could lead to a rapid loss of investor confidence [6]

Group 4: Future Directions
- The industry is moving towards multi-modal data and "world models" that understand the physical world, suggesting potential for future innovation despite current limitations [7]
- Investors believe there is still significant untapped value in current AI models, with strong growth in products like ChatGPT contributing to OpenAI's recurring revenue of $12 billion annually [8]
Apple takes a contrarian stance against several tech giants
news flash· 2025-07-12 14:55
Core Insights
- Competition in the AI field is increasingly focused on "reasoning capabilities," as major tech companies like OpenAI, Google, and Anthropic race to develop large models with enhanced reasoning abilities [1]
- Nvidia's CEO Jensen Huang emphasized the scaling law, stating that larger models trained on more data lead to better performance and quality in intelligent systems [1]
- A recent report from Apple titled "The Illusion of Thinking" challenges the prevailing trend by demonstrating that current leading models struggle with complex reasoning tasks, showing near-zero accuracy under such conditions [1]
- There is speculation that Apple's report may be a strategic move, as the company currently lags behind its competitors in the large model race [1]
Research report picks | Don't rush to find the next CATL; follow these "shovel sellers" and there is still money to be made
第一财经· 2025-06-20 02:38
Group 1
- The computing power sector is experiencing significant growth, with Nvidia reducing costs by 70%, and analysts optimistic about an 80% penetration rate and a market size of $40 billion [4][5]
- The demand for low-power, high-speed cluster solutions is driving the need for higher integration, which may provide better solutions [6]
- Leading communication equipment manufacturers have mature solutions, indicating that the CPO switch industry may soon be industrialized [7]

Group 2
- The compound annual growth rate (CAGR) for shipments in the next five years is expected to reach 123%, with a market opportunity of $250 billion on the horizon [9][10]
- The solid-state battery industry is accelerating its 0-1 industrialization due to increasing support from policies and applications [10]
- The global market for solid-state batteries is projected to exceed $250 billion by 2030, with rapid growth expected in the domestic market by 2027 [12]
How do you connect a GPU cluster? A look at the much-discussed "super node"
半导体行业观察· 2025-05-19 01:27
Core Viewpoint
- The article discusses the emergence and significance of Super Nodes in addressing the increasing computational demands of AI, highlighting their advantages over traditional server architectures in terms of efficiency and performance [4][10][46].

Group 1: Definition and Characteristics of Super Nodes
- Super Nodes are defined as highly efficient structures that integrate numerous high-speed computing chips to meet the growing computational needs of AI tasks [6][10].
- Key features of Super Nodes include extreme computing density, powerful internal interconnects using technologies like NVLink, and deep optimization for AI workloads [10][16].

Group 2: Evolution and Historical Context
- The concept of Super Nodes evolved from earlier data center designs focused on resource pooling and space efficiency, with significant advancements driven by the rise of GPUs and their parallel computing capabilities [12][13].
- The transition to Super Nodes is marked by the need for high-speed interconnects to facilitate massive data exchanges between GPUs during model parallelism [14][21].

Group 3: Advantages of Super Nodes
- Super Nodes offer superior deployment and operational efficiency, leading to cost savings [23].
- They also provide lower energy consumption and higher energy efficiency, with potential for reduced operational costs through advanced cooling technologies [24][30].

Group 4: Technical Challenges
- Super Nodes face several technical challenges, including power supply systems capable of handling high wattage demands, advanced cooling solutions to manage heat dissipation, and efficient network systems to ensure high-speed data transfer [31][32][30].

Group 5: Current Trends and Future Directions
- The industry is moving towards centralized power supply systems and higher voltage direct current (DC) solutions to improve efficiency [33][40].
- Next-generation cooling solutions, such as liquid cooling and innovative thermal management techniques, are being developed to support the increasing power density of Super Nodes [41][45].

Group 6: Market Leaders and Innovations
- NVIDIA's GB200 NVL72 is highlighted as a leading example of Super Node technology, showcasing high integration and efficiency [37][38].
- Huawei's CloudMatrix 384 represents a strategic approach to achieving competitive performance through large-scale chip deployment and advanced interconnect systems [40].
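To make concrete why super nodes lean on scale-up interconnects such as NVLink rather than ordinary server networking, the sketch below estimates how long a single ring all-reduce over the gradients of a large model would take at different per-GPU bandwidths. This is an order-of-magnitude illustration of my own; the bandwidth numbers are approximate public figures that mix per-direction and aggregate specs, and the model size is a placeholder.

```python
# Rough estimate of ring all-reduce time for one gradient synchronization step.
# Bandwidth figures are approximate public numbers; the model size is a placeholder.

def ring_allreduce_seconds(payload_bytes: float, num_gpus: int, bw_bytes_per_s: float) -> float:
    """An ideal ring all-reduce moves ~2*(n-1)/n of the payload through each link."""
    traffic = 2 * (num_gpus - 1) / num_gpus * payload_bytes
    return traffic / bw_bytes_per_s


if __name__ == "__main__":
    params = 70e9                  # hypothetical 70B-parameter model
    grad_bytes = params * 2        # bf16 gradients, 2 bytes each
    gpus = 8                       # GPUs inside one scale-up domain
    links = {
        "PCIe Gen5 x16 (~64 GB/s)": 64e9,
        "400G Ethernet NIC (~50 GB/s)": 50e9,
        "NVLink-class scale-up (~900 GB/s)": 900e9,
    }
    for name, bw in links.items():
        t = ring_allreduce_seconds(grad_bytes, gpus, bw)
        print(f"{name:<36} ~{t * 1000:7.1f} ms per all-reduce")
```

Even in this idealized model the gap is more than an order of magnitude, which lines up with the article's point that dense internal interconnects, together with the power delivery and cooling needed to feed them, are what distinguish a super node from a rack of conventionally networked servers.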