Are Large Models Closest to Commercial Application, with the Smallest Bubble? Industry Debates the "AI Bubble Theory"
Nan Fang Du Shi Bao·2025-11-17 16:34

Core Viewpoint
- The ongoing global wave of AI investment has sparked debate over whether it represents a revolutionary advance or a speculative bubble; experts suggest that, however it is classified, the current investment wave is unlikely to cool down [1][3].

Group 1: Perspectives on AI Investment
- Michael Spence views the current AI investment trend as a "rational bubble," arguing that the cost of under-investment outweighs the cost of over-investment [1].
- Wu Buxi, CEO of Hangzhou Darwen Intelligent Co., believes that while AI does exhibit bubbles, particularly in areas such as embodied intelligence and video generation, its potential to surpass previous industrial revolutions remains strong [3].
- Ma Jing of Ant Group emphasizes the disconnect between technological capability and real demand, noting that many AI products have yet to find practical application [5].

Group 2: Technological and Market Dynamics
- The rapid advance in computing power, from Deep Blue in 1997 to the resources now required for GPT-level training, suggests that humanity is at a pivotal moment in AI development [4].
- AI bubbles are unevenly distributed across sectors: some areas are overhyped, while others, such as large models, are closer to commercial viability [3][5].
- The expectation of achieving AGI within a decade is viewed as unrealistic, which contributes to the perception of a bubble in AI investment [6].

Group 3: Strategies for Addressing AI Bubbles
- Wu Buxi advocates a "survival of the fittest" approach, in which startups must adapt to real-world demand to avoid ineffective investment [8].
- Ma Jing suggests that improving asset utilization and clarifying business models can help the industry return to rationality, particularly in smart manufacturing and at the foundational layer [5].
- Open-source initiatives are highlighted as a means of mitigating bubbles by fostering equal competition and reducing information asymmetry [9].