This Type of Memory Is Taking Off
Semiconductor Industry Observer · 2025-03-20 01:19

Core Viewpoint
- Micron, Samsung, and SK Hynix have launched SOCAMM, a new memory module designed for AI and low-power servers that combines high capacity, high performance, a small footprint, and low power consumption [1][3].

Group 1: SOCAMM Overview
- SOCAMM is a small memory module measuring 14×90 mm that holds up to four 16-die LPDDR5X memory stacks; Micron's initial module offers a capacity of 128GB [1][2].
- Micron's SOCAMM consumes only one-third of the power of a 128GB DDR5 RDIMM, a significant advantage in energy efficiency [2][3].
- SOCAMM modules are expected to be used in Nvidia's GB300 Grace Blackwell Ultra Superchip systems, simplifying server production and maintenance, which could have a positive impact on pricing [3].

Group 2: Technical Specifications and Advantages
- SOCAMM provides a modular solution that delivers high memory bandwidth at low power, with Micron's memory rated at speeds up to 9.6 GT/s and SK Hynix's at 7.5 GT/s [1][2].
- Compared with traditional DRAM, SOCAMM is more cost-effective and may allow LPDDR5X memory to be placed directly on the substrate, further improving energy efficiency [4].
- SOCAMM features up to 694 I/O ports, far more than LPCAMM's 644 or traditional DRAM's 260, and is a removable module that allows easy upgrades [4].

Group 3: Industry Implications
- Nvidia is independently advancing the SOCAMM standard, which may replace the SO-DIMM standard as the industry shifts toward AI workloads requiring large amounts of DRAM [5].
- The introduction of SOCAMM aligns with Nvidia's strategy to make AI mainstream, as highlighted by CEO Jensen Huang at CES 2025 [5].
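The capacity and I/O figures quoted above can be sanity-checked with simple arithmetic. The sketch below assumes the 128GB is spread evenly across four 16-die LPDDR5X stacks (the even split is an assumption, not stated in the article) and compares the quoted I/O port counts:

```python
# Sanity-check the SOCAMM figures from the article.
stacks = 4                # LPDDR5X stacks per SOCAMM module [1][2]
dies_per_stack = 16       # "16-die" stacks
module_capacity_gb = 128  # Micron's initial module

# Assumption: capacity split evenly across all dies.
total_dies = stacks * dies_per_stack
per_die_gb = module_capacity_gb / total_dies
per_die_gbit = per_die_gb * 8  # 1 GB = 8 Gb

print(f"{total_dies} dies -> {per_die_gb} GB ({per_die_gbit:.0f} Gb) per die")

# I/O port counts quoted in the article [4].
io_ports = {"SOCAMM": 694, "LPCAMM": 644, "traditional DRAM": 260}
for name, n in io_ports.items():
    ratio = n / io_ports["traditional DRAM"]
    print(f"{name}: {n} I/O ports ({ratio:.1f}x traditional DRAM)")
```

The implied density of 16 Gb per die is consistent with common LPDDR5X die capacities, which lends plausibility to the module figures quoted above.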