Research Report | NVIDIA H20 Export Ban Lifted, Helping Release Demand; Share of Externally Sourced AI Chips in China Expected to Rebound to 49%
TrendForce集邦 · 2025-07-16 09:05
Core Viewpoint
- NVIDIA is expected to resume sales of H20 GPUs in the Chinese market, which will boost demand from local AI and cloud service providers, keeping the H20 a key player among high-end AI chips and increasing HBM demand [1][2].

Group 1: Market Dynamics
- TrendForce estimates that NVIDIA's push to meet its original shipment targets will raise the share of NVIDIA and AMD chips purchased in the Chinese AI market to 49%, up from the 42% previously estimated under export restrictions [2].
- The resumption of H20 supply is expected to effectively release pent-up deployment demand for AI applications in China, particularly among large cloud service providers (CSPs), which will prioritize building their own data center infrastructure [2].
- NVIDIA plans to launch a special version of the RTX PRO 6000 for the Chinese market to meet diverse application needs, including edge AI inference [2].

Group 2: HBM Insights
- H20 units shipped in 2024 primarily used HBM3 8hi, with a gradual upgrade to HBM3e 8hi from early 2025 that increases total capacity [2].
- Many self-developed ASIC products in China currently use previously procured HBM2e, but the H20 is expected to be favored, increasing its share of HBM consumption [2].

Group 3: Uncertainties and Opportunities
- The development of China's AI market is shaped by international circumstances, leaving a degree of uncertainty for NVIDIA [2].
- With the export ban on the H20 lifted, local CSPs and OEMs are expected to actively build inventory, while domestic AI suppliers and ecosystems develop rapidly under supportive policies [2].
HBM Demand Analysis of Major Overseas Vendors
傅里叶的猫 · 2025-06-15 15:50
Core Viewpoint
- The article discusses the projected growth in HBM (High Bandwidth Memory) consumption, driven primarily by major players like NVIDIA, AMD, Google, and AWS, highlighting rising demand from AI-related applications and an evolving product landscape.

Group 1: HBM Consumption Projections
- In 2024, overall HBM consumption is expected to reach 6.47 billion Gb, a year-on-year increase of 237.2%, with NVIDIA's and AMD's GPUs accounting for 62% and 9% of consumption, respectively [1].
- By 2025, total HBM consumption is projected to rise to 16.97 billion Gb, year-on-year growth of 162.2%, with NVIDIA, AMD, Google, AWS, and others contributing 70%, 7%, 10%, 8%, and 5%, respectively [1].

Group 2: NVIDIA's HBM Demand
- NVIDIA's HBM demand for 2024 is estimated at 6.47 billion Gb, with a recent adjustment bringing the total capacity to 6.55 billion Gb [2].
- In 2025, NVIDIA's HBM demand is expected to decrease to 2.53 billion Gb, with HBM3e 8hi and 12hi versions making up 36% and 64% of the demand, respectively [2].
- Key suppliers for NVIDIA include Samsung and SK hynix, which play crucial roles in the HBM supply chain [2].

Group 3: AMD's HBM Demand
- AMD's HBM demand for 2025 is projected at 0.20 billion Gb for the MI300 series and 0.37 billion Gb for the higher-end MI350 series [3].
- Specific models such as the MI300X and MI325 are designed for higher memory density, with capacities of 192GB and 288GB, respectively [3].
- AMD relies on SK hynix and Samsung for HBM3e 8hi and 12hi versions, which are vital to its production plans [3].

Group 4: Google and AWS HBM Demand
- Google's HBM demand for 2025 is expected to be 0.41 billion Gb, primarily driven by TPU v5 and v6 training needs [4].
- AWS's HBM demand is estimated at 0.28 billion Gb, with Trainium v2 and v3 accounting for 0.20 billion Gb and 0.08 billion Gb, respectively [6].
- Both companies use HBM configurations that enhance their AI training and inference capabilities, with a focus on reducing reliance on external suppliers [5][6].

Group 5: Intel's HBM Demand
- Intel's HBM demand is relatively small, accounting for about 10% of total demand in 2025, and focuses primarily on HBM3e versions [7].
- Key suppliers for Intel include SK hynix and Micron, with Intel exploring in-house chip development to reduce supply chain dependencies [7].
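The growth percentages and device capacities above are straightforward arithmetic on the quoted figures. A minimal sketch to reproduce them; the variable names are mine, the consumption numbers are the article's estimates, and the stack configuration (8 stacks of 12-high stacks built from 24Gb dies) is an assumed typical HBM3e layout, not something the article specifies:

```python
# Year-on-year HBM consumption growth, from the totals quoted above
# (figures in billions of Gb; all numbers are the article's estimates).
hbm_2024 = 6.47   # 2024 overall HBM consumption
hbm_2025 = 16.97  # 2025 projected overall HBM consumption

yoy_2025 = (hbm_2025 / hbm_2024 - 1) * 100
print(f"2025 YoY growth: {yoy_2025:.1f}%")  # ~162.3%; the article rounds to 162.2%

# Implied 2023 base from the stated 237.2% growth in 2024:
hbm_2023 = hbm_2024 / (1 + 237.2 / 100)
print(f"Implied 2023 consumption: {hbm_2023:.2f} billion Gb")

# Stack-capacity arithmetic (assumed config, not confirmed by the article):
# 8 stacks x 12 dies per stack x 24 Gb per die = 2304 Gb = 288 GB,
# matching the 288GB figure quoted for the higher-capacity accelerator.
# (8 stacks of 8hi with the same dies would give 1536 Gb = 192 GB.)
stacks, dies_per_stack, die_density_gbit = 8, 12, 24
total_gbit = stacks * dies_per_stack * die_density_gbit
print(f"{total_gbit} Gb = {total_gbit // 8} GB")  # prints: 2304 Gb = 288 GB
```

Note that the per-vendor percentage shares do not reconcile exactly with the absolute Gb figures quoted for individual vendors; the source appears to round aggressively, so treat these as order-of-magnitude checks rather than exact derivations.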