TechInsights Releases Initial Findings of its NVIDIA Blackwell HGX B200 Platform Teardown
Nvidia (US:NVDA) · GlobeNewswire News Room · 2025-04-14 14:00

Core Insights
- TechInsights released early-stage findings on NVIDIA's Blackwell HGX B200 platform, highlighting its advanced AI and HPC capabilities in data centers [1]
- The GB100 GPU features SK hynix's HBM3E memory and TSMC's advanced packaging architecture, marking significant technological advancements [1][2]

HBM3E Supplier
- The GB100 GPU incorporates eight HBM3E packages, each with eight memory dies in a 3D configuration, achieving a maximum capacity of 192 GB [2]
- The per-die capacity of 3 GB represents a 50% increase over the previous generation of HBM [2]

CoWoS-L Packaging Technology
- The GB100 GPU utilizes TSMC's 4 nm process node and features the first instance of CoWoS-L packaging technology, which significantly enhances performance compared to the previous Hopper generation [3]
- The GB100's design includes two GPU dies, nearly doubling the die area compared to its predecessor [3]

HGX B200 Server Board
- Launched in March 2024, the HGX B200 server board connects eight GB100 GPUs via NVLink, supporting x86-based generative AI platforms [4]
- The board supports networking speeds up to 400 Gb/s through NVIDIA's Quantum-2 InfiniBand and Spectrum-X Ethernet platforms [4]

TechInsights Overview
- TechInsights provides in-depth intelligence on semiconductor innovations, aiding professionals in understanding design features and component relationships [6][7]
- The TechInsights Platform serves over 650 companies and 125,000 users, offering extensive reverse engineering and market analysis in the semiconductor industry [8]
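The memory figures quoted for the HBM3E configuration can be checked with simple arithmetic. The sketch below uses only the values stated in the findings (eight packages, eight dies per package, 3 GB per die); the 2 GB previous-generation per-die capacity is inferred from the stated 50% increase, not given directly.

```python
# Capacity check for the GB100's HBM3E configuration (values from the article)
packages = 8          # HBM3E packages on the GB100 GPU
dies_per_package = 8  # memory dies stacked per package (3D configuration)
gb_per_die = 3        # per-die capacity in GB

total_gb = packages * dies_per_package * gb_per_die
print(total_gb)  # 192, matching the stated 192 GB maximum capacity

# Per-die increase over the previous HBM generation
# (2 GB per die is inferred from the stated 50% increase)
prev_gb_per_die = 2
increase_pct = (gb_per_die - prev_gb_per_die) / prev_gb_per_die * 100
print(increase_pct)  # 50.0, matching the stated 50% increase
```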