Google's rolling out its most powerful AI chip, taking aim at Nvidia with custom silicon

Core Insights
- Google is launching its most advanced chip, the Ironwood Tensor Processing Unit (TPU), to attract AI companies with custom silicon solutions [2][3]
- The Ironwood TPU is designed to boost performance for large AI models and real-time applications, significantly outperforming its predecessor [3][4]
- Google is seeing strong demand for its AI infrastructure, driving substantial growth in cloud revenue [5][6]

Product Launch
- The Ironwood TPU will become publicly available soon, following initial testing and deployment [2]
- The chip can connect up to 9,216 units in a single pod, easing data bottlenecks for demanding AI models [3]
- Major clients, including AI startup Anthropic, plan to use up to 1 million Ironwood TPUs for their models [4]

Market Position
- Google is competing with Microsoft, Amazon, and Meta in the AI infrastructure space, emphasizing the advantages of custom silicon over traditional GPUs [3]
- The company is making its cloud offerings more cost-effective and efficient to compete with AWS and Microsoft Azure [4]

Financial Performance
- In Q3, Google reported cloud revenue of $15.15 billion, a 34% year-over-year increase [5]
- The company secured more billion-dollar cloud contracts in the first nine months of 2025 than in the previous two years combined [5]
- Google raised its capital spending forecast for the year to $93 billion, up from $85 billion, to meet growing demand [5][6]