LPDDR5X Memory

Frantic Stockpiling: NVIDIA Sets Its Sights on This Type of Memory
半导体行业观察· 2025-07-17 00:50
Core Viewpoint - NVIDIA is set to produce a significant inventory of its modular memory solution, SOCAMM, to enhance the performance and efficiency of its AI products, with an expected production of 600,000 to 800,000 units this year [3][4][9].

Summary by Sections

SOCAMM Memory Overview
- SOCAMM memory is based on LPDDR DRAM, traditionally used in mobile devices, and is designed to be modular and upgradeable, differing from soldered HBM and LPDDR5X solutions [3][4].
- The bandwidth of SOCAMM memory is approximately 150-250 GB/s, making it a versatile option for AI PCs and servers [5].

Production and Market Impact
- The initial production target of 800,000 units is lower than the volume of HBM memory supplied to NVIDIA by its partners, but production is expected to ramp up next year with the introduction of SOCAMM 2 [4][6].
- Major memory manufacturers, including Micron, Samsung, and SK Hynix, are competing to establish a foothold in the emerging SOCAMM market, which is anticipated to be as strategically important as the HBM market [6][10].

Technical Advantages
- SOCAMM is designed for low power consumption, requiring significantly less power than traditional DDR5 modules, which makes it suitable for large-scale deployment [7][9].
- The modular design allows for easy upgrades, and SOCAMM is expected to be used alongside HBM chips in NVIDIA's upcoming AI accelerators [7][9].

Competitive Landscape
- Micron has begun mass production of SOCAMM modules, claiming more than 2.5 times the bandwidth of RDIMM at only one-third of RDIMM's power consumption [9].
- Samsung aims to regain its leadership in the DRAM market by leveraging its dominance in LPDDR technology to enter the SOCAMM market [10].
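The comparative claims above (more than 2.5x RDIMM bandwidth at one-third the power) can be sanity-checked with a short script. The DDR5 RDIMM baseline figures below are illustrative assumptions, not values from the article; only the multipliers come from the reported claims:

```python
# Illustrative SOCAMM vs. DDR5 RDIMM comparison.
# Baseline RDIMM figures are ASSUMED for illustration; the
# multipliers (>2.5x bandwidth, 1/3 power) come from the article.

rdimm_bandwidth_gbps = 80.0   # assumed DDR5 RDIMM bandwidth, GB/s
rdimm_power_w = 20.0          # assumed 128GB DDR5 RDIMM power, W

socamm_bandwidth_gbps = rdimm_bandwidth_gbps * 2.5  # "over 2.5x RDIMM"
socamm_power_w = rdimm_power_w / 3                  # "one-third of RDIMM"

# Under these claims, bandwidth per watt improves by 2.5 * 3 = 7.5x,
# independent of the assumed baseline numbers.
efficiency_gain = (socamm_bandwidth_gbps / socamm_power_w) / \
                  (rdimm_bandwidth_gbps / rdimm_power_w)

print(f"SOCAMM bandwidth: {socamm_bandwidth_gbps:.0f} GB/s")
print(f"SOCAMM power:     {socamm_power_w:.1f} W")
print(f"Perf/W gain:      {efficiency_gain:.1f}x")
```

Note that the derived 200 GB/s figure falls inside the 150-250 GB/s range the article quotes, so the claims are at least internally consistent.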
Micron Ships the First LPDDR5X Memory Based on the 1γ Process Node
news flash· 2025-06-06 07:29
Core Viewpoint - Micron Technology has announced the shipment of the world's first LPDDR5X memory samples built on the 1-gamma (1γ) process node, marking a significant advancement in memory technology [1].

Group 1: Product Features
- The LPDDR5X memory achieves a speed of 10.7 Gbps while reducing power consumption by up to 20% [1].
- The package thickness of the LPDDR5X has been reduced to 0.61 mm, a 14% decrease compared to the previous generation [1].

Group 2: Market Strategy
- Micron has begun shipping samples of the 16GB LPDDR5X product based on the 1γ node to select partners [1].
- The company plans to offer a range of capacities from 8GB to 32GB for flagship smartphones expected in 2026 [1].
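The packaging claim implies a previous-generation thickness that can be back-computed from the article's two numbers alone, a quick arithmetic sketch:

```python
# Back-compute the previous-generation package thickness from the
# article's figures: 0.61 mm after a 14% reduction.

new_thickness_mm = 0.61
reduction = 0.14

# If new = prev * (1 - 0.14), then prev = new / 0.86.
prev_thickness_mm = new_thickness_mm / (1 - reduction)

print(f"Implied previous-gen thickness: {prev_thickness_mm:.2f} mm")
# 0.61 / 0.86 ≈ 0.71 mm
```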
This Type of Memory Is Heating Up
半导体行业观察· 2025-03-20 01:19
Core Viewpoint - Micron, Samsung, and SK Hynix have launched SOCAMM, a new memory module designed for AI and low-power servers, combining high capacity, high performance, small size, and low power consumption [1][3].

Group 1: SOCAMM Overview
- SOCAMM is a small memory module measuring 14x90 mm, capable of holding up to four 16-die LPDDR5X memory stacks; Micron's initial module offers a capacity of 128GB [1][2].
- Micron's SOCAMM consumes only one-third of the power of a 128GB DDR5 RDIMM, a significant advantage in energy efficiency [2][3].
- The SOCAMM modules are expected to be used in Nvidia's GB300 Grace Blackwell Ultra Superchip systems, simplifying server production and maintenance, which could positively impact pricing [3].

Group 2: Technical Specifications and Advantages
- SOCAMM provides a modular solution that delivers high memory bandwidth while maintaining low power consumption, with Micron's memory rated at speeds up to 9.6 GT/s and SK Hynix's at 7.5 GT/s [1][2].
- Compared to traditional DRAM modules, SOCAMM is more cost-effective and may allow LPDDR5X memory to be placed directly on the substrate, further enhancing energy efficiency [4].
- SOCAMM features up to 694 I/O ports, far more than LPCAMM's 644 or traditional DRAM's 260, and its removable module design allows easy upgrades [4].

Group 3: Industry Implications
- Nvidia is independently advancing the SOCAMM standard, which may replace the SO-DIMM standard as the industry shifts toward AI workloads that require substantial DRAM [5].
- The introduction of SOCAMM aligns with Nvidia's strategy to make AI mainstream, as highlighted by CEO Jensen Huang at CES 2025 [5].
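The capacity figure above is consistent with the stack description: reaching 128GB from four stacks of 16 LPDDR5X dies implies 2GB (16Gb) per die. A small sketch checking that, together with the I/O-count comparison the article quotes:

```python
# Check the die density implied by the article: a 128GB module
# built from four stacks of 16 LPDDR5X dies each.
module_capacity_gb = 128
stacks = 4
dies_per_stack = 16

per_die_gb = module_capacity_gb / (stacks * dies_per_stack)
print(f"Implied die capacity: {per_die_gb:.0f} GB ({per_die_gb * 8:.0f} Gb)")

# I/O counts quoted in the article, highest first.
io_ports = {"SOCAMM": 694, "LPCAMM": 644, "traditional DRAM": 260}
for name, count in sorted(io_ports.items(), key=lambda kv: -kv[1]):
    print(f"{name:>16}: {count} I/O ports")
```

16Gb is a common LPDDR5X die density, so the article's module math holds together.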