Nvidia expects to sell $1 trillion in AI chips through 2027 — and it's pushing further into inference
Business Insider· 2026-03-16 20:48
Core Insights
- Nvidia CEO Jensen Huang announced a new inference system at the GTC conference, a significant step toward maintaining the company's leadership in AI as inference becomes a critical area of competition [1]
- The company anticipates a surge in demand for its AI systems, projecting at least $1 trillion in demand for its Blackwell and Rubin systems by 2027, a substantial increase from its previous estimate of $500 billion through 2026 [1]

Group 1
- Nvidia introduced the Groq 3 LPX chip, which can accelerate inference workloads by up to 35 times by integrating technology from AI startup Groq with Nvidia's Vera Rubin architecture [2]
- The Groq chip is manufactured by Samsung and is expected to ship in the second half of the year [2]
- Huang said the "inflection point of inference has arrived," signaling a pivotal moment in the AI landscape [2]

Group 2
- The new inference system builds on a $20 billion deal with Groq, under which Nvidia licensed Groq's technology and hired its top engineers [7]
- Nvidia's GPUs continue to dominate the AI market for both training and inference, but competition is intensifying as various companies develop specialized, cost-effective systems for inference tasks [8]
- The emergence of AI agents could significantly boost demand for inference capabilities [8]

Group 3
- Companies such as OpenAI are exploring alternatives to Nvidia's hardware out of dissatisfaction with its inference chips; OpenAI has reportedly signed a $10 billion deal with Cerebras for compute resources [9]