Core Viewpoint
OpenAI is reportedly dissatisfied with the performance of several Nvidia AI chips and has been seeking alternatives since last year, signaling a strategic shift toward specialized chips for AI inference, which is becoming a new battleground in the industry [1][2].

Group 1: OpenAI's Strategic Shift
- OpenAI is placing greater emphasis on AI inference chips as it seeks to strengthen its ability to analyze new, unlabeled data for predictions and decisions [1].
- OpenAI has begun discussions with companies such as AMD and Cerebras to procure high-performance inference chips, indicating a diversification of its hardware sources [4][5].
- The company is exploring integrated memory chips that combine SRAM with other components to improve processing speed, since inference demands higher memory access than training [5][6].

Group 2: Nvidia's Position and Response
- Nvidia remains the dominant player in the AI training chip market, and its inference chips are still regarded as the best in performance and total cost of ownership [1][3].
- Nvidia CEO Jensen Huang has stated that the proposed $100 billion investment in OpenAI is not a formal commitment and will be evaluated case by case [2].
- Despite OpenAI's search for alternatives, Nvidia maintains that its hardware is foundational to OpenAI's operations, emphasizing a deep collaborative relationship [3][6].

Group 3: Market Dynamics and Competition
- Competition in the AI inference chip market is intensifying, with OpenAI's search for alternatives seen as a challenge to Nvidia's dominance [1][4].
- Nvidia's recent technology-licensing agreements with companies such as Groq may affect OpenAI's negotiations with these firms [4][6].
- OpenAI's internal dissatisfaction with Nvidia's hardware has been noted particularly in its Codex product, which is critical for code generation [5][6].
OpenAI reportedly "dissatisfied" with Nvidia (NVDA.US) AI chips; Altman responds personally: the "crazy" claim is baseless