NVIDIA's Chip Dominance Under Pressure as OpenAI Turns to Google TPUs
Huan Qiu Wang·2025-06-29 04:06

Group 1
- OpenAI has begun renting Google TPU chips to power products such as ChatGPT, marking its first large-scale use of non-NVIDIA chips [1][3]
- The move aims to reduce reliance on Microsoft's data centers and lower inference computing costs, as ChatGPT's paid subscriber count has surged from 15 million at the start of the year to over 25 million [3]
- OpenAI's spending on NVIDIA server chips exceeded $4 billion last year and is projected to approach $14 billion in 2025 [3]

Group 2
- Google has been developing TPU chips for about a decade and began offering them to cloud customers in 2017; companies such as Apple and Meta also rent them [3][4]
- Despite giving OpenAI access to TPUs, Google reserves its more powerful TPU versions for its own AI teams and Gemini models [3]
- Google Cloud continues to rent out NVIDIA-based servers because they remain the industry standard and are more profitable, and it has ordered over $10 billion worth of NVIDIA's latest Blackwell server chips [4]