Core Insights
- AMD predicts that within five years, 5 billion people, more than half of the world's population, will use AI daily; active AI users have already grown from 1 million to over 1 billion since ChatGPT launched in late 2022 [1][3]
- The company aims to close the significant gap in AI computational power by providing the infrastructure needed for AI to proliferate [3][10]

AI Infrastructure and Products
- AMD introduced the Helios platform, which integrates 72 AI GPUs and delivers 2.9 ExaFLOPS of compute, designed for large-scale AI model training [3][9]
- The next-generation MI455 GPU, at the heart of the Helios platform, packs over 300 billion transistors and offers up to a 10x performance improvement over the previous MI300 series [10][12]
- AMD's upcoming MI500 series, set to launch in 2027, targets a 1000x performance increase over four years, a major leap in AI computing capability [12][14]

Local AI Solutions
- AMD is pursuing local AI through the Ryzen AI Max 400 series, which integrates a neural processing unit (NPU) for offline AI workloads [5][17]
- The Ryzen AI Max 400 series targets high-performance applications, supporting up to 12 CPU cores and 60 TOPS of dedicated AI processing power [17][18]
- A portable AI development system, Ryzen AI Halo, can run models with up to 200 billion parameters offline, serving professionals who need mobile AI capabilities [22]

Spatial Intelligence and Future Directions
- AI is transitioning from language intelligence to spatial intelligence, emphasizing the need for AI to understand and interact with the physical world [25][26]
- Li Fei-Fei, a leading figure in AI, highlighted the importance of developing world models that learn 3D/4D structure and spatial relationships, which would significantly extend AI's capabilities [26][27]

Consumer Graphics and AI CPUs
- AMD launched the Radeon RX 9070 and RX 9070 XT graphics cards, featuring the new RDNA 4 architecture and AI image technologies to enhance gaming experiences [28][29]
- The EPYC Venice server CPU, designed for AI data centers, will support up to 256 high-performance cores and aims to double memory and GPU bandwidth, ensuring an efficient data supply to AI GPUs [33]
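The headline figures above invite a quick sanity check. The sketch below works out the per-GPU share of the Helios platform's throughput and the year-over-year gain implied by a 1000x improvement over four years. It assumes the 2.9 ExaFLOPS figure is aggregate low-precision AI throughput across all 72 GPUs and that the 1000x target compounds evenly, neither of which the source states explicitly.

```python
# Back-of-the-envelope check of the performance figures quoted above.
# Assumptions (not stated in the source): "2.9 ExaFLOPS" is aggregate
# low-precision AI throughput for the full 72-GPU Helios rack, and the
# 1000x gain compounds evenly over four years.

HELIOS_TOTAL_FLOPS = 2.9e18   # 2.9 ExaFLOPS for the whole platform
HELIOS_GPU_COUNT = 72

# Per-GPU share of the platform's throughput, expressed in PetaFLOPS.
per_gpu_pflops = HELIOS_TOTAL_FLOPS / HELIOS_GPU_COUNT / 1e15

# Annual multiplier implied by a 1000x gain compounded over four years.
yearly_multiplier = 1000 ** (1 / 4)

print(f"Per-GPU throughput: ~{per_gpu_pflops:.1f} PFLOPS")
print(f"Implied year-over-year gain: ~{yearly_multiplier:.2f}x")
```

Under these assumptions, each GPU contributes roughly 40 PFLOPS, and the 1000x roadmap implies sustaining about a 5.6x performance gain every year.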
Lisa Su and Li Fei-Fei light up CES as AMD's full-stack AI ambitions emerge: from the cloud to the personal PC, AI chip performance is set to surge 1000x in four years