Microsoft reveals second generation of its AI chip in effort to bolster cloud business

Core Insights
- Microsoft has announced the Maia 200, a next-generation AI chip designed to compete with Nvidia and with offerings from Amazon and Google [2][3]
- The Maia 200 is touted as the most efficient inference system Microsoft has ever deployed, with plans for wider customer availability in the future [3]

Chip Development and Features
- The Maia 200 follows the Maia 100, which was not made available to cloud clients, and is expected to be more broadly accessible [3]
- The chip is built on Taiwan Semiconductor Manufacturing Co.'s 3-nanometer process and connects four chips within each server using Ethernet cables [6]

Performance and Efficiency
- The Maia 200 offers 30% higher performance than competing alternatives at the same price point, with more high-bandwidth memory than Amazon's Trainium and Google's tensor processing unit [7]
- Microsoft can connect up to 6,144 Maia 200 chips together, optimizing energy usage and reducing total cost of ownership [7]

Application and Deployment
- The new chip will be used by Microsoft's superintelligence team and in products such as Microsoft 365 Copilot and Microsoft Foundry [4]
- Microsoft is equipping its U.S. Central data centers with Maia 200 chips, with plans to expand to other regions [5]