Core Insights

- NVIDIA announced the BlueField-4 data processor at CES 2026, which supports the NVIDIA inference context memory storage platform, a new AI-native storage infrastructure designed for frontier AI [1][4]
- The platform addresses the need for scalable infrastructure to store and share the context data generated by AI models, which now scale to trillions of parameters and perform multi-step reasoning [1][4]
- The NVIDIA inference context memory storage platform extends GPU memory capacity for context memory, enabling high-speed sharing across nodes and improving token throughput and energy efficiency by up to 5x compared with traditional storage [5]

Company Developments

- NVIDIA CEO Jensen Huang emphasized that AI is transforming the entire computing stack, including storage, as models move beyond simple interactions to become intelligent collaborative partners capable of long-term reasoning and memory [2][5]
- The platform increases the capacity of key-value (KV) caches and accelerates context sharing across large-scale AI system clusters, improving response times and throughput for multi-turn AI agents [2][5]; a conceptual sketch of this KV-cache offloading pattern follows below
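To make the KV-cache idea concrete, the following is a minimal conceptual sketch, not NVIDIA's actual API: all class and method names (`TieredKVCache`, `put`, `get`) are hypothetical. It only illustrates the general pattern the announcement describes, keeping hot KV blocks in a small fast tier while spilling older blocks to a larger storage tier so a previously computed conversation prefix can be reused instead of recomputed.

```python
# Conceptual sketch of KV-cache offloading to an external storage tier.
# Hypothetical names throughout; this is not NVIDIA's platform API.

from collections import OrderedDict


class TieredKVCache:
    """Keeps the hottest KV blocks in a small 'GPU' tier and evicts the rest
    to a larger 'storage' tier instead of recomputing them."""

    def __init__(self, gpu_capacity_blocks: int):
        self.gpu_capacity = gpu_capacity_blocks
        self.gpu_tier = OrderedDict()   # block_id -> kv_block (fast, small)
        self.storage_tier = {}          # block_id -> kv_block (slow, large)

    def put(self, block_id: str, kv_block: bytes) -> None:
        # Insert into the fast tier; spill least-recently-used blocks to storage.
        self.gpu_tier[block_id] = kv_block
        self.gpu_tier.move_to_end(block_id)
        while len(self.gpu_tier) > self.gpu_capacity:
            evicted_id, evicted_block = self.gpu_tier.popitem(last=False)
            self.storage_tier[evicted_id] = evicted_block

    def get(self, block_id: str) -> bytes | None:
        # Hit in the fast tier: mark as recently used and return.
        if block_id in self.gpu_tier:
            self.gpu_tier.move_to_end(block_id)
            return self.gpu_tier[block_id]
        # Miss: promote from the storage tier if present, avoiding a fresh
        # prefill for a conversation prefix that was already computed.
        if block_id in self.storage_tier:
            kv_block = self.storage_tier.pop(block_id)
            self.put(block_id, kv_block)
            return kv_block
        return None


if __name__ == "__main__":
    cache = TieredKVCache(gpu_capacity_blocks=2)
    cache.put("turn-1", b"kv-for-turn-1")
    cache.put("turn-2", b"kv-for-turn-2")
    cache.put("turn-3", b"kv-for-turn-3")            # evicts "turn-1" to storage
    assert cache.get("turn-1") == b"kv-for-turn-1"   # served from the storage tier
```

In a real deployment the storage tier would be shared across nodes, which is what would allow multi-turn agents on different machines to reuse each other's context rather than regenerating it.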
Jensen Huang: AI is completely transforming the computing stack, and now storage is joining the change | Live from CES