Musk: In the AI Computing-Power Race, China Is Already One Step Ahead
Sou Hu Cai Jing·2026-01-12 00:56

Group 1
- The core argument presented by Elon Musk is that future competition in AI computing power will hinge on electricity rather than chips [4][5]
- China has established a structural advantage in meeting the electricity demands of large AI data centers, each of which can consume as much power as a small city [5][6]
- By 2026, China's annual electricity generation is projected to reach three times that of the United States, indicating that the limiting factor for AI development will be electricity supply rather than chip supply [6][7]

Group 2
- Goldman Sachs predicts that by 2030 China will have "global-level electricity redundancy," allowing it to absorb the surge in electricity demand from the AI industry, while the electricity gap at U.S. data centers is widening at roughly 15% per year [7][8]
- A growing number of tech giants now weigh electricity as heavily as chips, with calls for the U.S. to elevate energy investment to a strategic level comparable to chip development [8][9]
- Morgan Stanley has raised its forecast for the U.S. data-center electricity gap by 35%, implying that under similar chip-supply conditions China will have more "usable computing power" [9][10]

Group 3
- The shift in capital is already visible: global tech companies are reassessing where to site computing power, prioritizing regions with stable, low-cost electricity [11][12]
- The rules of AI competition are expanding from algorithms and chips to electricity and infrastructure resilience, suggesting that control over electricity will set the long-term ceiling on AI capability [11][12]
- Plans for three supercomputing centers in Europe have been shelved, with resources redirected to hydropower-rich regions in western China [12]