No Longer Dependent on NVIDIA for AI Chips, Switching to Google? OpenAI Responds

Core Insights

- OpenAI says it currently has no plans to deploy Google's in-house chips for its products at scale, despite early testing of Google's Tensor Processing Units (TPUs) [1]
- OpenAI has reportedly begun renting Google's AI chips to meet its growing computational demands, marking its first significant use of non-NVIDIA chips [1]
- OpenAI aims to reduce inference costs by leveraging Google's TPUs, which are expected to be a cheaper alternative to NVIDIA GPUs [1]
- The company continues to rely heavily on NVIDIA GPUs and AMD AI chips while also developing its own chip, with the key "tape-out" milestone expected this year [1]

Industry Dynamics

- OpenAI has signed a contract to use Google Cloud services to address its increasing computational needs, an unexpected collaboration between two major competitors in the AI field [2]
- Despite the collaboration with Google, most of OpenAI's computing power still comes from GPU servers provided by the emerging cloud service company CoreWeave [2]