AMD, Marvell, Intel: Which Is The Next Multi-Trillion Chip Stock
Forbes · 2025-10-09 12:15

Core Insights
- AMD has entered a significant agreement with OpenAI to supply tens of thousands of GPU chips, amounting to 6 gigawatts of computing power over five years, marking one of the largest chip acquisitions in the AI industry [2]
- The AI computing race is shifting focus from training large language models to inference, which is crucial for real-world applications, leading to increased demand for efficient computing solutions [3][4]
- Morgan Stanley projects approximately $3 trillion will be invested in AI over the next three years, with a significant portion likely directed toward inference, which could surpass training in both revenue and GPU units shipped [4]

AMD's Position
- The partnership with OpenAI positions AMD as a serious contender in the inference market, offering competitive performance and cost advantages compared to Nvidia [7]
- AMD's MI-series chips are becoming attractive alternatives for organizations that cannot afford Nvidia's top-tier GPUs, providing solid performance for inference tasks [7]

Nvidia's Market Dynamics
- Nvidia is expected to maintain its leadership in the AI market due to its established software ecosystem and partnerships, although its market share may decline as competition increases [5][6]
- The company's dominance in training with its H100 and A100 GPUs may be challenged as the focus shifts to inference, where energy efficiency and hardware availability matter most [3][4]

Competitive Landscape
- Intel is positioned to capture a share of the inference market with its diverse portfolio of CPUs and accelerators, despite lagging in cutting-edge GPU technology [8]
- ASICs are gaining traction for large-scale inference workloads due to their cost and energy efficiency, with companies like Marvell and Broadcom poised to benefit from this trend [8]

Hyperscaler Strategies
- Major tech companies such as Amazon, Alphabet, and Meta are developing their own AI chips to reduce costs and gain supply control, which may decrease their reliance on Nvidia's GPUs [9]
- Chinese companies such as Alibaba and Baidu are also enhancing their AI chip capabilities, with Alibaba planning to launch a new inference chip to support its cloud division [9]

Infrastructure Demand
- The growth of AI inference workloads will drive demand for supporting infrastructure, underscoring the need for fast, reliable networking solutions from companies like Arista Networks and Cisco [9]