NVIDIA GB300 Grace Blackwell Ultra Superchip
SuperX Launches Rack-Scale AI Platform Powered by NVIDIA GB300 Chips, Redefining Data Center Infrastructure
PR Newswire · 2025-10-16 10:05
Core Insights
- SuperX AI Technology Limited has launched the SuperX GB300 NVL72 System, a cutting-edge AI supercomputing platform built on the NVIDIA GB300 Grace Blackwell Ultra Superchip and aimed at overcoming the limitations of traditional data center infrastructure [1][2][3]

Product Overview
- The SuperX GB300 NVL72 System delivers up to 1.8 exaFLOPS of AI performance in a single liquid-cooled rack, a significant advance in compute density and energy efficiency [2][5]
- The system integrates 72 NVIDIA Blackwell Ultra GPUs and 36 NVIDIA Grace CPUs, achieving 900GB/s of chip-to-chip link bandwidth with a total of 165TB of HBM3E and 17TB of LPDDR5X memory [4][6]

Technological Advancements
- The system's design necessitates advanced power delivery, in particular 800-volt direct current (800VDC), to ensure stability and operational viability for next-generation workloads [3]
- The GB300 System is optimized for massive horizontal scaling, connecting large numbers of GPUs into a single system and redefining standards for large-scale AI training and inference [5][6]

Market Positioning
- SuperX positions the GB300 NVL72 System as foundational infrastructure for organizations building future AI capabilities, serving sectors such as hyperscale computing, scientific research, and industrial applications [7][10]
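The headline figures above lend themselves to a quick back-of-the-envelope check. The sketch below is not part of the release: all inputs are the article's quoted numbers, and the derived per-device values are purely illustrative.

```python
# Back-of-the-envelope breakdown of the rack-level figures quoted above.
# Inputs are the article's numbers; derived values are illustrative only,
# not vendor specifications.

RACK_EXAFLOPS = 1.8      # claimed AI performance per rack
GPUS_PER_RACK = 72       # NVIDIA Blackwell Ultra GPUs
CPUS_PER_RACK = 36       # NVIDIA Grace CPUs
HBM3E_TB = 165           # total HBM3E per rack (TB), as quoted
LPDDR5X_TB = 17          # total LPDDR5X per rack (TB), as quoted

pflops_per_gpu = RACK_EXAFLOPS * 1000 / GPUS_PER_RACK   # 1 exaFLOPS = 1000 PFLOPS
hbm_per_gpu_tb = HBM3E_TB / GPUS_PER_RACK
lpddr_per_cpu_gb = LPDDR5X_TB * 1000 / CPUS_PER_RACK    # 1 TB = 1000 GB

print(f"Compute per GPU:  {pflops_per_gpu:.1f} PFLOPS")  # ~25.0 PFLOPS
print(f"HBM3E per GPU:    {hbm_per_gpu_tb:.2f} TB")      # ~2.29 TB, as quoted
print(f"LPDDR5X per CPU:  {lpddr_per_cpu_gb:.0f} GB")    # ~472 GB
```

The roughly 472GB of LPDDR5X per Grace CPU derived here is consistent with that chip's published 480GB configuration.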
Micron Innovates From the Data Center to the Edge With NVIDIA
GlobeNewswire · 2025-03-18 20:23
Core Insights
- Micron Technology, Inc. is the first and only memory company shipping both HBM3E and SOCAMM products for AI servers, reinforcing its leadership in low-power DDR for data center applications [1][2][3]

Product Innovations
- Micron's SOCAMM, developed in collaboration with NVIDIA, supports the NVIDIA GB300 Grace Blackwell Ultra Superchip, enhancing AI workload performance [2][4]
- The HBM3E 12H 36GB offers 50% more capacity and 20% lower power consumption than competitors' offerings, while the HBM3E 8H 24GB is also available for various NVIDIA platforms [6][15]
- SOCAMM is described as the fastest, smallest, lowest-power, and highest-capacity modular memory solution, designed for AI servers and data-intensive applications [5][10]

Performance Metrics
- SOCAMM provides more than 2.5 times the bandwidth of RDIMMs at the same capacity, allowing faster access to larger datasets [10]
- The HBM3E 12H 36GB delivers significant power savings and improved computational capability for GPUs, essential for AI training and inference applications [4][6]

Market Positioning
- Micron aims to sustain its technology momentum with the upcoming HBM4 solution, expected to boost performance by more than 50% over HBM3E [7]
- The company is showcasing its complete AI memory and storage portfolio at GTC 2025, emphasizing collaboration with ecosystem partners to meet the growing demands of AI workloads [3][8]

Storage Solutions
- Micron's SSDs, including the 61.44TB 6550 ION NVMe SSD, are designed for high-performance AI data centers, delivering more than 44 petabytes of storage per rack [11]
- Integrating Micron LPDDR5X memory on platforms such as NVIDIA DRIVE AGX Orin boosts processing performance while reducing power consumption [11]
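Two of the quoted figures can be checked with simple arithmetic. The sketch below is illustrative only: all inputs come from the article, and the derived drive count is an estimate, not a disclosed rack configuration.

```python
# Simple checks on two figures quoted above. Inputs come from the article;
# the derived drive count is an estimate, not a disclosed configuration.

# HBM3E 12-high (36GB) vs 8-high (24GB): the quoted 50% capacity increase.
hbm_8h_gb = 24
hbm_12h_gb = 36
capacity_gain = (hbm_12h_gb - hbm_8h_gb) / hbm_8h_gb
print(f"12H vs 8H capacity gain: {capacity_gain:.0%}")   # 50%

# Number of 61.44TB 6550 ION SSDs implied by the quoted 44PB per rack.
ssd_tb = 61.44
rack_pb = 44
drives = rack_pb * 1000 / ssd_tb                         # 1 PB = 1000 TB
print(f"Drives for {rack_pb} PB: ~{drives:.0f}")         # ~716 drives
```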