Constellation's Wang on Google-Nvidia Chips Rivalry
Bloomberg Television · 2025-11-26 07:17
AI Chip Landscape
- Tensor Processing Units (TPUs) are purpose-built for AI and deep learning, offering lower total cost and greater power efficiency than GPUs [1]
- Google has been developing TPUs for some time, aiming for efficiency and for supply chain diversification beyond Nvidia [2][3]
- Google's full-stack approach, from chip to application, provides significant economies of scale [5][6]
- Diversifying the chip base is crucial, as different chips excel at different tasks, much like diversifying across cloud providers [10][11]

Market Demand and Competition
- The AI market is projected to reach a $7 trillion market cap by 2030, indicating substantial demand [8]
- Demand is large enough to accommodate multiple players, so the rivalry between TPUs and GPUs is not a zero-sum game [8][9]
- Potential TPU adopters include hyperscalers that do not compete directly with Google, pharmaceutical giants, energy companies, and governments [13][14]
- AMD and Google are positioned to provide alternatives to Nvidia's dominance in the AI chip market [15]

Google's AI Capabilities
- Gemini 3 is competitive with other leading large language models such as ChatGPT, Claude, and Perplexity, excelling across a range of use cases [16][17]
- Sovereign AI and companies building data centers and physical AI will drive market headlines in 2026 [24]

Nvidia's Outlook
- Models suggest Nvidia could add another $1 trillion in market cap from sovereign AI and another $1 trillion from physical AI, potentially peaking at around $6.5 to $7 trillion [22][23]