Akamai Inference Cloud Transforms AI from Core to Edge with NVIDIA
PR Newswire · 2025-10-28 17:57
Core Insights
- Akamai Technologies has launched Akamai Inference Cloud, a platform designed to enhance AI inference by moving processing closer to users and devices, thereby reducing latency and improving performance [1][2][4]

Company Overview
- Akamai Inference Cloud combines Akamai's global edge network, which spans more than 4,200 locations, with NVIDIA's advanced AI infrastructure to provide scalable, low-latency AI processing [4][5]
- The platform aims to support the next generation of AI applications, including personalized digital experiences and real-time decision systems, by enabling intelligent, agentic AI inference at the edge [2][3]

Technological Advancements
- The platform integrates NVIDIA RTX PRO Servers and BlueField DPUs, enabling AI workloads to be processed efficiently from core to edge [4][5]
- Akamai Inference Cloud is designed to support streaming inference and agentic AI workflows, allowing near-instantaneous responses and improved user engagement [5][9]

Market Positioning
- The launch targets 20 initial locations globally, with plans for further expansion, positioning Akamai as a leader in edge AI processing [6]
- The collaboration with NVIDIA aims to redefine AI inference by decentralizing data processing and routing requests to optimal models, supporting use cases such as smart commerce agents and financial decision-making [5][9]
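The routing idea underlying this kind of edge platform — sending each inference request to the lowest-latency location that can serve the requested model — can be sketched in a few lines of Python. This is purely illustrative: the location names, latency figures, and the `route_request` helper are hypothetical and do not reflect Akamai's or NVIDIA's actual APIs.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EdgeLocation:
    # Hypothetical edge point of presence; names and numbers are illustrative.
    name: str
    latency_ms: float        # measured round-trip time from the client
    models: frozenset        # models this location can serve

def route_request(model: str, locations: list[EdgeLocation]) -> EdgeLocation:
    """Pick the lowest-latency location that hosts the requested model."""
    candidates = [loc for loc in locations if model in loc.models]
    if not candidates:
        raise LookupError(f"no edge location serves model {model!r}")
    return min(candidates, key=lambda loc: loc.latency_ms)

# Illustrative data: three hypothetical points of presence.
locations = [
    EdgeLocation("fra", 12.0, frozenset({"llm-small", "vision"})),
    EdgeLocation("nyc", 45.0, frozenset({"llm-small", "llm-large"})),
    EdgeLocation("sin", 80.0, frozenset({"llm-large"})),
]

print(route_request("llm-small", locations).name)  # → fra
print(route_request("llm-large", locations).name)  # → nyc
```

A production system would of course measure latency continuously and weigh capacity and cost alongside proximity, but the core decision — filter by capability, then minimize latency — is what "routing requests to optimal models" amounts to.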