'We're Lucky Because We Have Our Own TPUs,' Says DeepMind CEO Demis Hassabis — Yet Admits AI Memory Shortage Constrains Research
Yahoo Finance·2026-03-02 15:22

Group 1: AI Hardware Constraints
- The global AI race is facing hardware limitations, particularly in memory, graphics processing units, and electricity, which are slowing down AI deployment [1]
- Demand for Google's AI system, Gemini, exceeds current supply capabilities, indicating a strain on resources for experimentation and research [4]

Group 2: Competitive Landscape
- China is becoming increasingly significant in the global AI competition, with Chinese developers reportedly narrowing the gap with leading U.S. labs, although they are still several months behind [2]
- Notable advancements from Chinese companies like Alibaba and ByteDance highlight the presence of talented AI teams in China, although further breakthroughs are needed to achieve artificial general intelligence [3]

Group 3: Impact on Technology Companies
- Rising memory prices are affecting various technology companies, with Apple and HP reporting increased costs and anticipating lower financial results due to these pressures [5][6]
- Google benefits from designing its own tensor processing units (TPUs), which gives it greater control over its computing architecture, but it still relies on a limited number of suppliers for key components [7][8]