Nvidia CEO Huang says AI has to do '100 times more' computation now than when ChatGPT was released

Core Insights
- Nvidia CEO Jensen Huang emphasized that next-generation AI will require 100 times more computational power than previous models, due to new reasoning approaches that answer questions step by step [1]
- Nvidia reported revenue of $39.33 billion, up 78% year over year; data center revenue, driven primarily by AI-focused GPUs, soared 93% to $35.6 billion and now accounts for over 90% of total revenue [2]
- Despite the strong earnings, Nvidia's stock dropped 17% on January 27 amid concerns raised by competitors like DeepSeek, whose results suggested high AI performance might be achievable at lower infrastructure costs [3]

Company Performance
- Nvidia's fourth-quarter earnings exceeded analysts' expectations, with robust growth in both overall and data center revenue [2]
- The data center segment, crucial for AI workloads, has become Nvidia's dominant revenue source, underscoring the company's leadership in the GPU market [2]

Competitive Landscape
- Huang countered DeepSeek's claim that high AI performance can be achieved with lower infrastructure costs, asserting that reasoning models will necessitate more chips [3]
- Huang acknowledged DeepSeek's open-sourced reasoning model as a significant advancement in the field, reflecting the competitive pressure Nvidia faces [4]