First use of non-NVIDIA chips! OpenAI rents Google TPUs to cut inference costs
硬AI· 2025-06-28 13:24
Core Viewpoint

- OpenAI has begun renting Google's TPU chips at scale for the first time, marking a significant shift away from its reliance on NVIDIA chips, which have dominated the AI chip market [2][4].

Group 1: OpenAI's Shift to Google TPU

- OpenAI's decision to use Google TPU chips is driven by a surge in demand for computational power, with paid ChatGPT subscriptions increasing from 15 million to over 25 million since the beginning of the year [2][3].
- The collaboration with Google allows OpenAI to reduce its dependency on Microsoft's data centers and presents an opportunity for Google to challenge NVIDIA's GPU market dominance [2][4].
- OpenAI's spending on NVIDIA server chips exceeded $4 billion last year, with projections indicating that spending on AI chip services could approach $14 billion by 2025 [3].

Group 2: Industry Trends and Competitors

- The increasing demand for AI capabilities has prompted several companies, including Amazon, Microsoft, OpenAI, and Meta, to develop their own inference chips to lessen reliance on NVIDIA [2][4].
- Meta is also considering the use of TPU chips, indicating a broader trend among major AI players to explore alternatives to NVIDIA [5].
- Google has been developing TPU chips for about a decade and has offered them to cloud customers since 2017, with other companies such as Apple and Cohere also using Google Cloud's TPUs [4][5].