AI Compute "Picks and Shovels" Series (3): NVIDIA GB200 Reshapes the Value of Servers/Copper Cables/Liquid Cooling/HBM
Guohai Securities·2024-10-20 02:00

Industry Investment Rating
- The report maintains a "Recommend" rating for the computer industry [1]

Core Views
- NVIDIA's Blackwell series, expected to launch in Q4 2024, is anticipated to enhance the value of four markets: server racks, copper cables, liquid cooling, and HBM [4]
- The GB200, part of the Blackwell series, offers significantly higher computational power than the H100, with a 5x increase in AI performance (20 petaFLOPS at FP4) [4]
- Major CSPs such as Meta, Alphabet, Microsoft, and Amazon are increasing their capital expenditures for 2024, with Meta raising its forecast to $37-40 billion [4]
- The transition from HGX to MGX server designs in the GB200 NVL72 is expected to boost the value of rack integration, HBM, copper connections, and liquid cooling by 2-10x [4]

Server and Component Analysis
- The GB200 NVL72 server system, featuring 18 compute trays and 9 switch trays, is designed for high-performance AI workloads, with each compute tray containing 2 Grace CPUs and 4 B200 GPUs (see the rack-level arithmetic sketch at the end of this note) [38][40]
- The GB200 NVL72 offers 4x faster training performance and 30x faster inference performance than the H100, thanks to its second-generation Transformer Engine and fifth-generation NVLink [40]
- The system uses NVLink copper cables for internal communication, with over 5,000 NVLink cables in the GB200 NVL72 totaling more than 2 miles in length [61]

Copper Connections
- The DAC (Direct Attach Copper) market is expected to grow rapidly, with NVIDIA deploying DACs extensively in its AI clusters to minimize power consumption [57]
- The global high-speed cable market is projected to reach $2.8 billion by 2028, with DACs maintaining strong growth thanks to their low power consumption and high-speed capabilities [57]
- Amphenol leads the global DAC market with a significant share, while domestic Chinese manufacturers such as Luxshare Precision and Zhaolong Interconnect hold smaller global shares [59]

HBM Market
- HBM3E is set to begin mass production in the second half of 2024, with HBM4 expected to launch in 2026 [67][69]
- NVIDIA is the largest buyer of HBM, with its H100 and H200 GPUs driving demand for HBM3 and HBM3E, respectively [75]
- By the end of 2024, the DRAM industry's total HBM TSV capacity is expected to reach 250K wafers per month, with Samsung and SK Hynix leading in HBM production capacity [71][73]

Liquid Cooling
- The GB200 NVL72 adopts cold-plate liquid cooling, which is more mature and cost-effective than immersion cooling, achieving a PUE of 1.1-1.2 (see the PUE illustration at the end of this note) [83][85]
- Major manufacturers such as Foxconn have developed liquid cooling solutions for the GB200 NVL72, featuring hybrid air-liquid cooling systems [86][87]

Investment Recommendations
- The report recommends focusing on AI chips, server components (including HBM, copper connections, and liquid cooling), and data center infrastructure as key beneficiaries of the surge in AI compute demand [90][91]
- Key companies in the AI chip sector include NVIDIA, AMD, and Intel, while server manufacturers such as Foxconn, Wistron, and Quanta are highlighted for their roles in the GB200 supply chain [91]
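
To make the rack-level arithmetic behind the "NVL72" naming explicit, the sketch below simply tallies the tray counts given above (18 compute trays with 2 Grace CPUs and 4 B200 GPUs each, plus 9 switch trays) and multiplies through by the 20 petaFLOPS FP4 per-GPU figure the report cites. This is an illustrative back-of-the-envelope check, not additional data from the report.

```python
# Back-of-the-envelope tally of the GB200 NVL72 rack composition described above.
# Per-tray counts (2 Grace CPUs + 4 B200 GPUs per compute tray) come from the
# summary; treat this as an illustration, not an authoritative bill of materials.

COMPUTE_TRAYS = 18        # compute trays per NVL72 rack
SWITCH_TRAYS = 9          # NVLink switch trays per rack
GPUS_PER_TRAY = 4         # B200 GPUs per compute tray
CPUS_PER_TRAY = 2         # Grace CPUs per compute tray

total_gpus = COMPUTE_TRAYS * GPUS_PER_TRAY   # 72 -> the "72" in NVL72
total_cpus = COMPUTE_TRAYS * CPUS_PER_TRAY   # 36 Grace CPUs per rack

# Report claim: ~5x AI throughput per GPU vs. H100, quoted as 20 PFLOPS at FP4.
B200_FP4_PFLOPS = 20
rack_fp4_pflops = total_gpus * B200_FP4_PFLOPS  # ~1,440 PFLOPS FP4 per rack

print(f"GPUs per rack: {total_gpus}, Grace CPUs per rack: {total_cpus}")
print(f"Approximate rack FP4 throughput: {rack_fp4_pflops} petaFLOPS")
```

The 72-GPU total is where the NVL72 name comes from, and the resulting ~1,440 petaFLOPS of FP4 throughput is consistent with the roughly 1.4 exaFLOPS per-rack inference figure NVIDIA has publicized for the system.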
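
For readers unfamiliar with the PUE metric used in the liquid-cooling section, the short sketch below spells out what a PUE of 1.1-1.2 implies. PUE is total facility power divided by IT equipment power, so this range means roughly 10-20% overhead for cooling and power delivery. The 120 kW IT load is a hypothetical illustrative figure, not a number from the report, and the 1.5 case is included only as a rough stand-in for conventional air cooling.

```python
# Minimal illustration of the PUE (Power Usage Effectiveness) range quoted for
# cold-plate liquid cooling. PUE = total facility power / IT equipment power.
# The 120 kW IT load below is a hypothetical rack-scale figure for illustration.

def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power implied by a given IT load and PUE."""
    return it_load_kw * pue

it_load_kw = 120.0  # hypothetical IT load of one liquid-cooled rack

for pue in (1.1, 1.2, 1.5):
    total = facility_power_kw(it_load_kw, pue)
    overhead = total - it_load_kw
    print(f"PUE {pue}: total {total:.0f} kW, cooling/other overhead {overhead:.0f} kW")
```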