China Galaxy Securities: The Importance of Inference Computing Power Rises; Development of Computing-Power Subsegments Such as Optical Modules Accelerates Again

Core Insights
- The importance of inference computing power is increasing, with significant growth expected in related sectors such as optical modules and chips, driven by advancements in hardware and software from companies like NVIDIA [1][4]

Group 1: Inference Computing Power Growth
- Inference computing power is projected to continue growing, with NVIDIA's CEO stating that the demand for computing power will be 100 times greater than in the past to support advancements like AGI and embodied intelligent robots [1][2]
- The number of tokens processed by models has increased to over 100 trillion, with inference models requiring 20 times more tokens and 150 times more computational power than before [2]
- NVIDIA's Blackwell architecture shows a performance improvement of 68 times over the previous Hopper architecture, leading to an 87% reduction in costs [2]

Group 2: Hardware and Software Developments
- NVIDIA introduced the upgraded Blackwell Ultra architecture, emphasizing its potential to generate 50 times more revenue for data centers, with a clear development roadmap extending to 2026 and beyond [3]
- The launch of the open AI engine stack, NVIDIA Dynamo, aims to simplify inference deployment and scaling, potentially creating a new paradigm for hardware and software efficiency [3]
- The introduction of NVIDIA Llama Nemotron is expected to serve as a foundational model for inference, facilitating exploration in related areas and forming an ecosystem [3]

Group 3: Investment Recommendations
- The current landscape indicates that demand for computing power is not declining but is instead being stimulated by the growth of inference applications, suggesting substantial investment opportunities in the sector [4]
- Recommended investment targets include telecom operators such as China Mobile, China Unicom, and China Telecom, as well as companies in optical modules and chips like Zhongji Xuchuang, Xinyi Guosheng, and others [4]