AMD Instinct MI300X GPUs

Improve AI accelerator performance with AMD EPYC™ AI host processors
AMD · 2025-07-22 15:01
Performance Comparison
- AMD EPYC 9575F CPU achieves approximately 1.24x higher total token throughput on Mixtral 8x7B compared to Intel Xeon 8592+ [1]
- AMD EPYC 9575F CPU demonstrates roughly 1.10x greater total token throughput on Llama4-Maverick-17B-128E-FP8 versus Intel Xeon 8592+ [1]
- AMD EPYC 9575F CPU exhibits about 1.05x higher AI inference throughput on Deepseek-R1-SGLang compared to Intel Xeon 8592+ [1]
- Both the AMD EPYC 9575F and the Intel Xeon 8592+ were tested with 8x AMD Instinct MI300X GPUs [1]

Product & Technology
- AMD EPYC 9575F is positioned as the highest-performance CPU for hosting AI accelerators [1]

Legal & Trademark
- ©2025 Advanced Micro Devices, Inc. [1]
- AMD and the AMD Arrow Logo are trademarks of Advanced Micro Devices, Inc. in the United States and other jurisdictions [1]
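The headline ratios above are simple throughput quotients (candidate tokens/s divided by baseline tokens/s). A minimal sketch of that arithmetic, using hypothetical tokens-per-second figures rather than AMD's measured data:

```python
# Sketch: how a relative "total token throughput" figure like 1.24x is
# derived. The numbers below are hypothetical placeholders, not AMD's
# published benchmark results.

def speedup(candidate_tokens_per_s: float, baseline_tokens_per_s: float) -> float:
    """Relative throughput of a candidate host CPU versus a baseline."""
    return candidate_tokens_per_s / baseline_tokens_per_s

# Hypothetical example: 12,400 vs. 10,000 total tokens/s -> 1.24x
print(round(speedup(12_400.0, 10_000.0), 2))  # 1.24
```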
Seekr Selects Oracle Cloud Infrastructure to Deliver Trusted AI to Enterprise and Government Customers Globally
PR Newswire · 2025-06-12 12:00
Core Insights
- Seekr has entered a multi-year agreement with Oracle Cloud Infrastructure (OCI) to enhance enterprise AI deployments and develop next-generation vision-language foundation models [1][2][3]
- The partnership aims to leverage OCI's AI infrastructure, powered by AMD Instinct MI300X GPUs, for efficient and secure AI model training and deployment [2][3]
- SeekrFlow™, Seekr's AI software platform, will utilize OCI's capabilities to optimize GPU usage and scale large language models (LLMs) globally [2][4]

Company Overview
- Seekr is a privately held AI company focused on providing trustworthy and transparent AI solutions for enterprise and government clients [6]
- The company offers an end-to-end Enterprise AI platform that includes data preparation, analysis capabilities, and tools for building domain-specific LLMs and agentic AI solutions [6]

Partnership Details
- The collaboration between Seekr, OCI, and AMD aims to accelerate the availability of trusted AI solutions, particularly in the federal government sector [3][4]
- OCI's infrastructure is designed to handle demanding AI workloads, offering flexible pricing and easier migration of on-premises applications [4]

Technical Capabilities
- OCI's AI infrastructure enables efficient multi-node training and inference, allowing Seekr to train LLMs at lower cost while optimizing performance [3][4]
- The partnership emphasizes the importance of massive GPU compute capacity for developing advanced AI models, particularly for analyzing extensive imagery and sensor data [3]