Core Insights

- Microsoft has deployed its first self-developed AI chip, Maia 200, in a data center and plans to expand deployment in the coming months, positioning the chip as a core component of its AI inference computing power [1]
- The Maia 200 is optimized for the high computational loads of AI models in their large-scale deployment phase, with reported performance surpassing Amazon's latest Trainium chip and Google's Tensor Processing Unit (TPU) [1]
- Despite introducing its own chip, Microsoft CEO Satya Nadella emphasized the company's ongoing partnerships with Nvidia and AMD, signaling a strategy that does not rely solely on vertical integration [1]

Summary by Sections

Chip Development and Deployment

- Microsoft has announced the deployment of the Maia 200 chip, which is designed for AI inference and optimized for high-load scenarios [1]
- The chip's performance is reported to exceed that of competing chips from Amazon and Google [1]

Strategic Partnerships

- Nadella highlighted the importance of maintaining relationships with other chip manufacturers, stating that innovation from partners is crucial for future competitiveness [1]
- The company will continue to purchase chips from Nvidia and AMD despite developing its own [1]

Internal Usage and Future Plans

- The Maia 200 chip will first be used by Microsoft's "Superintelligence" team, led by Mustafa Suleyman, a co-founder of Google's DeepMind [2]
- This initiative aims to reduce reliance on external AI model providers such as OpenAI and Anthropic [2]
- The chip will also support OpenAI models running on Microsoft's Azure cloud platform, although access to advanced AI hardware remains limited [2]
Microsoft CEO: We will not stop purchasing chips