Frantic stockpiling: NVIDIA sets its sights on this type of memory
半导体行业观察· 2025-07-17 00:50
Core Viewpoint
- NVIDIA plans to stockpile a large volume of its modular memory solution, SOCAMM, to improve the performance and efficiency of its AI products, with an expected procurement of 600,000 to 800,000 units this year [3][4][9].

Summary by Sections

SOCAMM Memory Overview
- SOCAMM memory is based on LPDDR DRAM, traditionally used in mobile devices, and is designed to be modular and upgradeable, setting it apart from HBM and soldered LPDDR5X solutions [3][4].
- SOCAMM delivers roughly 150-250 GB/s of bandwidth, making it a versatile option for AI PCs and servers [5].

Production and Market Impact
- The initial target of 800,000 units is well below the volume of HBM that NVIDIA's partners supply, but production is expected to ramp up next year with the introduction of SOCAMM 2 [4][6].
- Major memory manufacturers, including Micron, Samsung, and SK Hynix, are competing to establish a foothold in the emerging SOCAMM market, which is expected to become as strategically important as the HBM market [6][10].

Technical Advantages
- SOCAMM is designed for low power consumption, drawing significantly less power than traditional DDR5 modules, which makes it suitable for large-scale deployment [7][9].
- The modular design allows for easy upgrades, and SOCAMM is expected to be used alongside HBM chips in NVIDIA's upcoming AI accelerators [7][9].

Competitive Landscape
- Micron has begun mass production of SOCAMM modules, claiming more than 2.5 times the bandwidth of RDIMM at only one-third of RDIMM's power consumption [9]; a back-of-envelope sketch of what these ratios imply follows below.
- Samsung aims to regain its leadership in the DRAM market by leveraging its dominance in LPDDR technology to enter the SOCAMM market [10].
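The cited ratios compose into a rough efficiency picture. The Python sketch below is illustrative only: the 192 GB/s SOCAMM figure is an assumed mid-range value from the 150-250 GB/s span above, and the 2.5x bandwidth and one-third power ratios are Micron's claims as reported, not measured specs. It works out the implied RDIMM baseline bandwidth and the implied bandwidth-per-watt gain.

```python
# Back-of-envelope arithmetic for the SOCAMM vs. RDIMM claims cited above.
# Assumptions (not official specs): a mid-range SOCAMM bandwidth of 192 GB/s
# taken from the 150-250 GB/s span, Micron's claimed ">2.5x" bandwidth ratio,
# and the claimed one-third power draw relative to RDIMM.

socamm_bandwidth_gbps = 192.0   # assumed mid-range value, GB/s
bandwidth_ratio = 2.5           # SOCAMM : RDIMM bandwidth (claimed)
power_ratio = 1.0 / 3.0         # SOCAMM : RDIMM power draw (claimed)

# Implied RDIMM baseline bandwidth if the 2.5x claim holds at this data point.
implied_rdimm_bandwidth = socamm_bandwidth_gbps / bandwidth_ratio

# Bandwidth-per-watt improvement combines both ratios:
# 2.5x the bandwidth at one-third the power -> 7.5x the efficiency.
bandwidth_per_watt_gain = bandwidth_ratio / power_ratio

print(f"Implied RDIMM baseline: ~{implied_rdimm_bandwidth:.0f} GB/s")
print(f"Bandwidth-per-watt gain: ~{bandwidth_per_watt_gain:.1f}x")
```

The 7.5x figure is simply the product of the two claimed ratios, not a measured benchmark; it illustrates why a low-power modular format is attractive for large-scale server deployment where power is the binding constraint.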