NVIDIA GPU vs. Google TPU: Which Supply Chains Face Intense Competition? (Media)

Investment Rating
- The industry rating is "Outperform the Market", indicating that the overall industry return is expected to exceed the market benchmark index by more than 5% over the next 6 months [15].

Core Insights
- The competition between NVIDIA and Google in the AI chip market relies heavily on TSMC's CoWoS advanced packaging, which is currently a critical bottleneck in the AI chip supply chain [3].
- TSMC's capital expenditure for 2026 is projected at $52 billion to $56 billion, a year-on-year increase of 27% to 37% driven by strong AI demand [3].
- NVIDIA is working with Amkor to expand advanced packaging capacity in the U.S. from 2026 to 2029, as TSMC reallocates some advanced packaging orders to OSAT manufacturers [3].
- Samsung and Intel are actively enhancing their advanced process capabilities, with Samsung aiming to raise its global 2nm monthly capacity to 21,000 wafers by the end of 2026 [4].
- HBM is a key battleground in the competition between NVIDIA's GPUs and Google's TPUs, influencing both performance ceilings and the quantities of chips that can actually be delivered [4].
- NAND and SSD demand is significantly amplified in AI data centers; NVIDIA's Rubin platform enhances data sharing and reuse, which could further lift SSD demand [5].
- Demand for inference cards is rising as large-model vendors seek alternatives to NVIDIA's chips to reduce dependency and costs [6].

Summary by Sections

Advanced Process and Packaging
- TSMC leads in advanced packaging, with CoWoS capacity constraints limiting NVIDIA's and Google's AI chip output [3].
- Amkor and ASE are being used to relieve TSMC's capacity pressure, with Amkor investing $5 billion in advanced packaging facilities in Arizona [3][4].

Storage Side
- HBM is crucial to the competition between NVIDIA and Google, while on-chip SRAM is emerging as a new direction for inference storage [4].
- The collaboration between NVIDIA and Groq focuses on inference technology utilizing on-chip SRAM [4].

Client Side
- Major AI model vendors are diversifying their compute resources, with Anthropic planning to deploy up to 1 million TPUs by 2026 and OpenAI partnering with Cerebras on a large-scale AI inference platform [6].

Investment Recommendations
- The report suggests focusing on sectors across the semiconductor supply chain, including foundries, advanced packaging, storage, and AI model applications, amid the competitive landscape between NVIDIA and Google [7].