COMPAL Optimizes AI Workloads with AMD Instinct MI355X at AMD Advancing AI 2025 and International Supercomputing Conference 2025

Core Insights
- Compal Electronics has launched its new high-performance server platform, the SG720-2A/OG720-2A, designed for generative AI and large language model (LLM) training. It is built around AMD Instinct™ MI355X GPUs and offers advanced liquid cooling options [1][3][6].

Technical Highlights
- The SG720-2A/OG720-2A supports up to eight AMD Instinct MI350 Series GPUs, enabling scalable training for LLMs and generative AI applications [7].
- It incorporates a dual cooling architecture, offering both air cooling and two-phase liquid cooling, optimized for high-thermal-density workloads to improve thermal efficiency [7].
- The GPUs are built on AMD's CDNA 4 architecture, each with 288GB of HBM3E memory and 8TB/s of memory bandwidth, and support FP6 and FP4 data formats tailored to AI and HPC applications [7].
- High-speed interconnect performance comes from PCIe Gen5 and AMD Infinity Fabric™, facilitating multi-GPU orchestration and reducing latency (see the multi-GPU sketch after this summary) [7].
- The platform is compatible with mainstream open-source AI stacks such as ROCm™, PyTorch, and TensorFlow, streamlining AI model integration (see the compatibility sketch after this summary) [7].
- It supports EIA 19" and ORv3 21" rack standards, with a modular design for easy upgrades and maintenance [7].

Strategic Collaboration
- Compal has a long-standing collaboration with AMD, co-developing solutions that improve the efficiency and sustainability of data center operations [5].
- Launching the SG720-2A/OG720-2A at both AMD Advancing AI 2025 and ISC 2025 underscores Compal's commitment to expanding its global visibility and partnerships in the AI and HPC sectors [7].
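As a minimal illustration of the ROCm/PyTorch compatibility noted above, the sketch below enumerates the GPUs visible to a ROCm build of PyTorch and runs a small matrix multiply on the first device. It assumes a host with ROCm and the ROCm wheel of PyTorch installed; it is not specific to Compal's platform. On ROCm builds, AMD GPUs are exposed through PyTorch's torch.cuda API.

```python
# Minimal sketch: verify that a ROCm build of PyTorch can see the GPUs
# and run a small workload on one of them.
# Assumption: ROCm and the ROCm build of PyTorch are installed on the host.
import torch

if not torch.cuda.is_available():  # ROCm GPUs surface through the torch.cuda API
    raise SystemExit("No ROCm-visible GPU found")

print(f"Visible GPUs: {torch.cuda.device_count()}")
for i in range(torch.cuda.device_count()):
    print(f"  [{i}] {torch.cuda.get_device_name(i)}")

# Small FP16 matrix multiply on the first GPU as a smoke test.
a = torch.randn(4096, 4096, dtype=torch.float16, device="cuda:0")
b = torch.randn(4096, 4096, dtype=torch.float16, device="cuda:0")
c = a @ b
torch.cuda.synchronize()
print("matmul OK:", tuple(c.shape))
```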
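To illustrate the kind of multi-GPU orchestration referenced in the interconnect bullet, the following sketch runs one PyTorch DistributedDataParallel process per GPU, launched with torchrun, so gradients are averaged across all devices each step. The tiny linear model and random data are placeholders for illustration only; on ROCm builds of PyTorch, the "nccl" backend name maps to RCCL.

```python
# Minimal DistributedDataParallel sketch: one process per GPU, launched with
#   torchrun --nproc_per_node=8 ddp_sketch.py
# The model and data below are placeholders, not a real training workload.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE in the environment.
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    dist.init_process_group(backend="nccl")  # "nccl" maps to RCCL on ROCm

    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    ddp_model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(ddp_model.parameters(), lr=1e-4)

    for step in range(10):
        x = torch.randn(32, 1024, device=local_rank)
        loss = ddp_model(x).square().mean()
        loss.backward()  # gradients are all-reduced across all GPUs here
        optimizer.step()
        optimizer.zero_grad()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```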