Core Insights
- The article argues that the future of AI development in the U.S. will follow the path of "electricity - computing power - implementation" [2]
- There are significant opportunities in energy infrastructure, alongside challenges to Nvidia's future position in the AI market [4]

Electricity as the Foundation of AI
- Electricity is described as the "food" of AI, since computing power runs on electricity [5]
- China has advantages in energy sources such as hydropower, thermal power, and wind/solar, along with leading ultra-high-voltage transmission technology [6]
- The article highlights the importance of affordable, stable electricity for computing power, particularly in the context of small nuclear reactors in the U.S. [8]

Computing Power: Transition from Training to Inference
- Demand for AI computing power falls into two segments: training and inference [9]
- Training makes AI smarter and requires flexible, programmable GPUs, which are Nvidia's specialty [10]
- Inference, which means using AI for practical applications, is expected to see a significant increase in demand, since many companies have not yet fully utilized AI [10]
- The article cites an MIT survey indicating that 95% of companies investing in AI have not effectively implemented it, suggesting massive growth potential for inference computing power [10]

Nvidia's Position and Challenges
- Nvidia's GPUs are expensive but offer the strong versatility and programmability essential for model training [11]
- During the inference phase, however, that versatility matters far less, opening the door to ASICs (Application-Specific Integrated Circuits) [11]
- Major tech companies are developing their own inference chips, which may reduce their reliance on Nvidia and signal a shift in market dynamics [11]

Opportunities for Broadcom
- Broadcom is positioned as a key player in the AI supply chain, acting as a "contractor" for major tech firms that need chip design and manufacturing services [13]
- As more companies move toward self-developed chips, Broadcom's order volume is expected to grow, leading to a divergence in stock performance between Broadcom and Nvidia [15]

Domestic AI Landscape
- The gap between China's domestic AI large models and those in the U.S. is narrowing, with a focus on practical implementation driving demand for inference computing power [17]
- Domestic chip manufacturers have an opportunity in the inference chip market, where the requirements are less stringent than for training chips [17]
- The article advises caution in this overheated market, suggesting a wait-and-see approach before investing [17]

Strategic Recommendations
- Focus on energy and electricity, particularly small nuclear reactors and clean energy, for medium- to long-term opportunities [18]
- Pay attention to computing infrastructure needs, including GPUs, ASICs, servers, and data centers [18]
- Consider investing in domestic inference chip companies once market sentiment cools down [18]
- Emphasize AI application implementation over merely training larger models; the true winners will be those who effectively integrate AI into production [18]
Source article: "Nvidia's Nightmare Arrives? OpenAI, Google, and Amazon Are All Building Their Own Chips"