Alchip
Asia-Pacific Technology - AI Supply Chain: TPU/ASIC updates; ICMS NAND demand calculation
2026-01-28 03:03
January 27, 2026 10:15 PM GMT | Asia-Pacific Technology | Asia Pacific AI Supply Chain: TPU/ASIC updates; ICMS NAND demand calculation

We sense that AI semi vendors have started securing critical 2027 components – T-Glass/ABF, HBM, and TSMC 3nm – (e.g., MediaTek's 3nm TPU). Downgrade Egis to EW, as its 2026 appears to be shaping up as a year of transition.

ASIC – Volume upside for MediaTek's 3nm TPU project in 2027: In our Target Price Up report, we highlighted the supply chain's bull case of 6-7mn TPU units ...
Research firm: 2027 AI ASIC shipments to reach three times 2024 levels
Zheng Quan Shi Bao Wang· 2026-01-27 10:51
On January 26, market research firm Counterpoint Research forecast in its latest report that the AI ASIC camp – non-GPU server AI chips – will see rapid growth in the near term, with 2027 shipments reaching three times 2024 levels and 2028 shipments, at more than 15 million units, expected to overtake GPUs (the growth rate implied by this multiple is checked in the short sketch after this article).

According to the report, this explosive growth is driven by strong demand for Google's TPU infrastructure, the continued expansion of AWS Trainium clusters, and rising output from Meta (MTIA) and Microsoft (Maia) as they scale up their in-house chip portfolios.

In terms of shipments to AI hyperscale data centers, Google is expected to retain market leadership through 2027, largely on the back of explosive growth in its Gemini ecosystem.

On Google's dominance, Counterpoint Research analyst David Wu emphasized: "Although Google's market share is expected to decline to 52% by 2027 as the overall market expands and competitors increasingly adopt in-house chips (working with design firms such as Broadcom, Marvell, and Alchip), its TPU clusters will remain the undisputed core and benchmark of the industry. That benchmark rests on the massive, sustained compute needed to train and run next-generation Gemini models, which in turn requires Google to keep expanding its in-house chip infrastructure continuously and aggressively."

The firm believes ...
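As a quick check on what the headline multiple implies, the snippet below converts the report's "2027 shipments at three times 2024 levels" claim into a compound annual growth rate. Only the 3x multiple and the 2024-2027 horizon come from the article; the rest is plain arithmetic.

```python
# Implied CAGR from Counterpoint's claim that 2027 AI ASIC shipments
# will be three times 2024 shipments.
growth_multiple = 3.0        # 2027 units / 2024 units, per the report
years = 2027 - 2024          # compounding periods

implied_cagr = growth_multiple ** (1 / years) - 1
print(f"Implied AI ASIC shipment CAGR, 2024-2027: {implied_cagr:.1%}")  # ~44.2% per year
```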
Asia-Pacific Technology - AI Supply Chain: CES implications, ASIC production, China AI chips
2026-01-07 03:05
Summary of Key Points from the Conference Call

Industry Overview
- The focus is on the **AI semiconductor industry**, particularly the dynamics surrounding **AI GPUs** and **AI ASICs**. The demand for these components is expected to be strong in 2026, driven by supply factors such as memory availability and TSMC's 3nm technology [1][4][42].

Core Insights
- **Nvidia's Production**: Nvidia's management reported that the **Rubin** compute board is in "full production," with assembly time significantly reduced from approximately **2 hours** for Blackwell to about **5 minutes** for Rubin. The launch is anticipated in the **second half of 2026** [2][54].
- **China's AI Chip Demand**: Around **2 million units** of H200 chips are forecast to be demanded by Chinese customers, with licensing processes ongoing. Companies like **ByteDance** are actively developing AI server racks compatible with both Nvidia and local chips [4][84].
- **Market Size Projections**: The total AI chip market is projected to reach **US$550 billion** by **2029**, including both AI GPUs and ASICs. This reflects a significant growth trajectory for the sector [5][42].

Capacity and Production Dynamics
- **TSMC's CoWoS Capacity**: TSMC is expected to expand its CoWoS capacity by **20-30%** in 2026, with a revised forecast of **125kwpm** by the end of the year, marking a **79% increase** from previous estimates [12][43].
- **ASE/SPIL and Amkor**: Both ASE/SPIL and Amkor are also expanding their CoWoS capacities to meet rising demand from key customers like Nvidia, AMD, and AWS [13][14].
- **Google TPU Production**: Google is accelerating the production of its next-generation **TPU** chips, moving the timeline from **4Q26** to **3Q26**. Broadcom has also booked **30k** of CoWoS-S capacity to meet TPU demand [26][28].

Financial Outlook
- **Revenue Growth**: TSMC is projected to generate **US$107 billion** from AI chip foundry services by 2029, which would account for about **43%** of its total revenue [44] (the total revenue implied by this ratio is worked out in the sketch after this summary).
- **Cloud Capex Spending**: Cloud capital expenditure for 2026 is estimated at **US$632 billion**, indicating robust investment in AI infrastructure [45].

Risks and Considerations
- **Supply Chain Risks**: The primary concerns for 2026 are expected to be shortages of memory, T-Glass, and TSMC's 3nm wafers, rather than CoWoS capacity itself [43][42].
- **China's Localization Efforts**: China is expected to increase its local chip production to support AI development, which may create additional demand for both local and foreign chips [81][82].

Additional Insights
- **ByteDance's AI Server Racks**: At a recent conference, ByteDance showcased its **256-node AI server racks**, which are designed to work with both Nvidia and local AI chips, highlighting the competitive landscape in China's AI market [84].
- **Market Dynamics**: The AI semiconductor market is characterized by rapid growth and evolving dynamics, with significant implications for companies involved in chip production and supply chain management [42][43].

This summary encapsulates the key points discussed in the conference call, providing insights into the current state and future outlook of the AI semiconductor industry.
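One figure in the Financial Outlook can be cross-checked directly: if US$107 billion of AI foundry revenue is about 43% of TSMC's 2029 total, the total is implied by simple division. This is a minimal sketch using only the two numbers cited in the summary, not an independent estimate.

```python
# Back out TSMC's implied 2029 total revenue from the summary's figures:
# US$107bn of AI foundry revenue said to be ~43% of total revenue.
ai_foundry_revenue_usd_bn = 107
ai_share_of_total = 0.43

implied_total_revenue_usd_bn = ai_foundry_revenue_usd_bn / ai_share_of_total
print(f"Implied TSMC 2029 total revenue: ~US${implied_total_revenue_usd_bn:.0f}bn")  # ~US$249bn
```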
Marvell’s CEO Says the Company Didn’t Lose Any Orders. Why Was Wall Street So Worried, and How Should You Play MRVL Stock Here?
Yahoo Finance· 2025-12-15 17:18
Core Viewpoint
- Microsoft is considering a partnership with Broadcom for a custom AI accelerator, which could negatively impact Marvell Technology's current collaboration with Microsoft on custom ASICs for Azure [1]

Company Overview
- Marvell Technology has a market cap of $71.6 billion and is a leading supplier of data-infrastructure semiconductor solutions, focusing on advanced system-on-a-chip architectures [3]
- The company's product lineup includes Ethernet solutions, processors, and custom ASICs, along with interconnect solutions and storage controllers [3]

Recent Stock Performance
- Marvell's shares dropped over 15% following reports of potential loss of key orders from Microsoft and Amazon Web Services, leading to a year-to-date decline of 24% [2][5]
- The stock's decline has raised concerns about Marvell's competitive position in the custom AI chip market [2]

Management Response
- CEO Matt Murphy has publicly denied reports of lost business with Microsoft and Amazon, asserting that Marvell's data center business remains strong [4][7]
- Several Wall Street analysts have supported this view, with some calling the negative reports "without merit" and reaffirming buy ratings on Marvell stock [8][9]

Financial Performance
- Marvell reported third-quarter fiscal 2026 net revenue of $2.08 billion, a 36.8% year-over-year increase, with data center revenue accounting for 73% of total revenue [10]
- The company guided for fourth-quarter revenue of approximately $2.2 billion and adjusted EPS of $0.79, aligning with Wall Street estimates [12]

Future Outlook
- CEO Murphy projected potential revenue growth to $10 billion in fiscal year 2027 and 40% year-over-year growth in fiscal year 2028 [13]
- Marvell announced plans to acquire Celestial AI for at least $3.25 billion to enhance its AI capabilities [14]

Analyst Sentiment
- Despite concerns over potential order losses, the consensus among analysts remains bullish, with a "Strong Buy" rating for Marvell stock and an average price target of $114.70, indicating a 37% upside potential [15] (the reference price implied by these two figures is worked out in the sketch below)
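The last bullet pairs a $114.70 average price target with 37% upside, which pins down the share price the analysts were measuring from. A minimal back-calculation using only those two published figures:

```python
# Infer the reference share price behind the consensus target and upside.
avg_price_target = 114.70   # consensus price target from the summary
upside = 0.37               # 37% upside potential

implied_current_price = avg_price_target / (1 + upside)
print(f"Implied MRVL reference price: ~${implied_current_price:.2f}")  # ~$83.72
```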
Marvell (MRVL) Stock Downgraded as Concerns Grow Over Amazon Trainium Transitions
Yahoo Finance· 2025-12-11 09:20
Core Viewpoint
- Marvell Technology, Inc. has been downgraded from Buy to Hold by Benchmark due to concerns over losing Amazon's Trainium chip designs to competitor Alchip, which is expected to impact the company's growth projections significantly [1][2].

Group 1: Downgrade and Market Reaction
- Benchmark downgraded Marvell's stock following insights from a Silicon Valley bus tour, indicating a high degree of conviction that the company has lost both Amazon's Trainium3 and Trainium4 designs [2].
- The downgrade is expected to be controversial, especially since Marvell has claimed there would not be a revenue "air pocket" from Amazon [2].
- Following the earnings report, Marvell's shares performed well, but the firm recommends investors take short-term profits due to an overly optimistic interpretation of the company's signals regarding Amazon [4].

Group 2: Revenue Projections and Market Dynamics
- The firm projects a slowdown in Marvell's growth to only 20% XPU growth in CY26, primarily due to the loss of Amazon's designs [2].
- While Marvell is expected to see increasing annual revenue from Amazon, this is believed to be driven by continued Trainium2 volumes and a Kuiper low-earth orbit engagement rather than a successful transition to Trainium3 designs [3].
- The recently announced Trainium3 is currently only an air-cooled version for customer evaluation, with the liquid-cooled variant not expected until mid-next year, which supports the expectation of continued Trainium2 volumes in Marvell's near-term forecasts [3].
When computing power can't keep pace with intelligence: the AI industry's gaps and breakouts in 2026 (86-page deck attached)
材料汇· 2025-12-10 15:51
Core Insights
- The rapid evolution of AI is outpacing the development of computing infrastructure, leading to a significant gap in computing power that is expected to widen by 2026. This gap will manifest in two key areas: a growing demand for core computing capabilities across chips, storage, packaging, and cooling, and a shift towards edge computing to reduce cloud latency and costs, resulting in an explosion of applications from AI smartphones to integrated robots [1].

Industry Overview
- The electronic sector reached a record high in Q3 2025, driven by AI, with the electronic index rising 44.5% year-to-date and outperforming the CSI 300 index by 26.6% [12][13].
- The semiconductor sector has shown significant growth, with various sub-sectors posting substantial year-to-date increases: PCB (+114%), consumer electronics (+51%), and semiconductors (+40%) [12][13].
- The overall electronic industry reported a 19% revenue increase and a 35% net profit increase in Q1-Q3 2025, with all major segments showing positive growth [18][24].

Performance Metrics
- The electronic sector's inventory levels have risen, particularly in consumer electronics and PCBs, indicating strong demand and recovery in terminal markets [22][25].
- The semiconductor sector's monthly sales growth has rebounded since June 2023, with a notable increase in demand for digital, storage, and equipment segments [34][41].

AI Impact on Semiconductor Cycle
- The semiconductor market is entering an upward cycle, with significant growth in capital expenditures from both domestic and international cloud service providers, driven by AI demand [41][42].
- Major cloud providers are expected to increase their capital expenditures significantly, with projections indicating 50%-60% growth in 2026 [43].

Consumer Electronics Trends
- Global smartphone sales are projected to recover, with a forecast of 1.29 billion units in 2024, a 6.1% year-on-year increase [26][27].
- The PC market is also expected to grow, with global sales reaching 263 million units in 2024, a 1.0% increase year-on-year [27][29].

Automotive Sector Insights
- The automotive market is experiencing a weak recovery, with global sales expected to reach 92.23 million units in 2025, a 1.8% year-on-year increase [39].
- The penetration rate of electric vehicles is projected to rise, with global penetration expected to reach 20% in 2025 [39].

AI Narrative Acceleration
- Competition among AI model developers has intensified, with significant advancements in model capabilities and applications across various sectors [47][50].
- AI-related spending is expected to reach $3-4 trillion by 2030, driven by the need for enhanced computing power and applications [58].

Edge Computing and Hardware Development
- The shift towards edge computing is becoming crucial, with the global edge AI market predicted to grow to ¥1.2 trillion by 2029, a CAGR of 39.6% [69].
- Major AI companies are actively entering the edge hardware market to enhance user experience and profitability [69].
UBS Global I/O Semiconductors - Cloud AI: how tight could N3 foundry and CoWoS be in 2026? [ERRATUM]
UBS· 2025-11-25 01:19
Investment Rating
- The report reiterates a Buy rating on TSMC as the leading Cloud/Edge AI foundry [4]
- ASE is also rated as a Buy due to its position as a key beneficiary of advanced packaging and testing [4]
- GPTC is upgraded to a Buy with a new price target of NT$1,800, implying a 35% upside [4][22]

Core Insights
- TSMC's N3 capacity is forecasted to reach 170kwpm by the end of 2026, up from 120kwpm at the end of 2025 [1]
- Cloud AI is expected to account for 35-40% of N3 demand in 2026, with smartphones and PCs making up 60-65% [1]
- CoWoS capacity is anticipated to be tight in 2026, with TSMC likely to accelerate capacity expansion [2]
- Demand forecasts for CoWoS have been raised significantly for Nvidia (13% increase), AMD (56% increase), and Broadcom [3]

Summary by Sections

N3 Foundry Supply-Demand Analysis
- TSMC's N3 capacity is projected to increase to 170kwpm by end-2026 from 120kwpm at end-2025 [1]
- Cloud AI products are expected to represent 35-40% of N3 demand in 2026, with other devices accounting for 60-65% [1] (the Cloud AI wafer range this implies is worked out in the sketch after this summary)
- N3 utilization is expected to be tight, particularly in Q4 2026 [1]

CoWoS Capacity and Demand
- CoWoS capacity is forecasted to reach 110kwpm by Q3 2026, with potential upside in late 2026 [2]
- Demand for CoWoS from Nvidia is expected to reach 3 million units in 2026, with AMD's demand forecast raised by 56% [3]
- Broadcom's CoWoS demand is projected to increase to 260-280k units in 2026, up from 90-100k in 2025 [3]

Stock Recommendations
- TSMC's capex for 2026/27 is raised to US$50bn/52bn from US$46bn/50bn, with a price target increase to NT$1,800 [4]
- ASE is highlighted as a key beneficiary of advanced packaging and testing [4]
- GPTC's long-term earnings CAGR is forecasted at 20% over 2027-29, with a significant market share in advanced packaging [22]
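To put the N3 supply-demand commentary in wafer terms, the sketch below applies the cited 35-40% Cloud AI share to the 170kwpm end-2026 capacity. Both inputs come from the summary; applying a demand share to an exit-rate capacity figure is a simplification for illustration, not UBS's own model.

```python
# Rough Cloud AI wafer demand implied by the cited N3 figures.
n3_capacity_kwpm_end_2026 = 170          # thousand wafers per month, end-2026
cloud_ai_share = (0.35, 0.40)            # Cloud AI share of N3 demand in 2026

low = n3_capacity_kwpm_end_2026 * cloud_ai_share[0]
high = n3_capacity_kwpm_end_2026 * cloud_ai_share[1]
print(f"Implied Cloud AI N3 demand: ~{low:.1f}-{high:.1f} kwpm")  # ~59.5-68.0 kwpm
```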
Asia-Pacific Technology - AI Supply Chain: TSMC to expand 3nm capacity for major AI customers' growth
2025-11-13 02:49
Summary of TSMC and AI Supply Chain Conference Call

Industry Overview
- The conference call focuses on the semiconductor industry, particularly TSMC's role in the AI supply chain and its capacity expansion plans for 3nm wafers in response to increasing demand from major AI customers like Nvidia and AMD [1][2][11].

Key Points and Arguments

TSMC's Capacity Expansion
- TSMC is considering expanding its 3nm wafer capacity by an additional 20,000 wafers per month (20 kwpm) in Taiwan, which could increase its 2026 capital expenditure (capex) to between US$48 billion and US$50 billion, up from the previously expected US$43 billion [3][12].
- The expansion is driven by strong demand from major customers, particularly Nvidia, which indicated a need for more capacity during a recent visit by its CEO [2][11].

Constraints and Challenges
- The main constraint for TSMC's expansion is the availability of clean room space, as all new clean room facilities are allocated for 2nm expansion. TSMC may relocate some 22nm/28nm production from Fab 15 to free up space for 3nm expansion [3][12].
- There is a noted shortage of 3nm wafers, which has affected several customers, including Nvidia, AMD, and Alchip [11].

CoWoS Capacity and Demand
- TSMC's CoWoS (Chip-on-Wafer-on-Substrate) capacity is expected to be sufficient to meet the projected demand from Nvidia's Rubin chips, despite concerns about potential bottlenecks in front-end capacity and materials like T-glass [4][18].
- The analysis indicates that total implied CoWoS consumption for TSMC could reach 629,000 wafers, with significant contributions from partnerships with OpenAI and AMD [21].

Stock Implications
- The potential increase in 3nm capex is viewed positively for global semiconductor capital sentiment. Morgan Stanley maintains an "Overweight" rating on TSMC and other related companies, anticipating better growth in AI semiconductors [6].

Customer Demand Breakdown
- Demand for TSMC's 3nm node is projected to grow significantly, with estimates of 110-120 kwpm in 2025 and 140-150 kwpm in 2026, potentially reaching 160-170 kwpm with the new expansion [11][13] (annualized wafer volumes for these run rates are sketched after this summary).
- Major customers include Nvidia, AMD, and AWS, with Nvidia expected to account for a substantial portion of the demand [28].

Additional Important Insights
- The conference call highlighted the importance of TSMC's strategic decisions regarding capacity allocation and customer relationships, particularly in the context of the rapidly evolving AI landscape [2][4].
- The analysis of power deployment plans indicates a strong correlation between AI chip demand and CoWoS capacity, suggesting that TSMC's ability to meet this demand will be critical for its future growth [18][21].

This summary encapsulates the key discussions and insights from the conference call, focusing on TSMC's strategic capacity expansions and the implications for the semiconductor industry in the context of AI demand.
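For scale, the run rates quoted in the demand breakdown can be annualized; the sketch below does that for the 2026 ranges. The kwpm figures come from the summary, while treating a run rate as constant for twelve months is a simplification (capacity ramps through the year), so these read as upper bounds.

```python
# Annualize the quoted 3nm run rates (kwpm = thousand wafers per month).
ranges_kwpm = {
    "2026 base case": (140, 150),
    "2026 with new expansion": (160, 170),
}
for label, (lo, hi) in ranges_kwpm.items():
    print(f"{label}: ~{lo * 12}k-{hi * 12}k wafers per year")
# 2026 base case: ~1680k-1800k wafers per year
# 2026 with new expansion: ~1920k-2040k wafers per year
```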
Marvell vs. Broadcom vs. Alchip vs. GUC – Update on ASIC Plays
2025-11-10 03:34
Summary of ASIC Industry Update

Industry Overview
- The document provides an update on the ASIC (Application-Specific Integrated Circuit) projects of major North American hyperscaler companies, including AWS, Microsoft, Meta, Google, OpenAI, Apple, and TikTok [2][3]

Key Companies and Their ASIC Projects

AWS (Amazon Web Services)
- **Trainium 2 Chip**: Expected to reach its end phase in Q4 2025, with a transitional chip, Trainium 2.5, to be produced in Q4 2025 and Q1 2026. Marvell is expected to ship approximately 200,000 units per quarter [4][3]
- **Trainium 3 Chip**: Forecasted production volume of around 2.5 million units, with potential allocation of up to 500,000 units to Marvell if Trainium 2.5 production is successful [8][9]
- **Trainium 4 Chip**: Designed by Annapurna and Alchip, expected to start mass production in Q4 2027 [9][10]

Microsoft
- **Cobalt 200 CPU and MAIA 200 Sphinx**: Designed by GUC, with MAIA 300 Griffin facing challenges in its development with Marvell. Microsoft may shift to Broadcom if confidence in Marvell wanes [14][16]
- **MAIA 200 and MAIA 300**: Part of the second-generation ASIC accelerator series, with the contract with Marvell expiring in H1 2026 [15][16]

Meta
- **ASIC Roadmap**: Includes multiple generations of chips, with the first-generation inference chip, Artemis, already in mass production. The second-generation training chip, Athena, is set for Q4 2023, and the third-generation chip, Iris, is planned for Q3 2024 [17][18]
- **Arke Chip**: A simplified inference-only chip designed by Broadcom and Marvell, expected to help Meta keep pace with NVIDIA's chip iterations [19][20]

Google
- **TPU Development**: The first-generation ASIC server CPU, Axion, is designed by Marvell, while the second-generation, Tamar, is designed by GUC. Google expects to produce about 4 million TPUs in 2026, with significant internal use [22][24]
- **Demand Surge for Optical Modules**: Due to the increase in TPU production, demand for 1.6T optical modules is expected to rise dramatically from 3 million units in 2025 to 20 million in 2026 [25][26]

OpenAI
- **Titan 1 and Titan 2 Chips**: Broadcom is developing these chips, with expected shipments of 300,000 units in 2026 and at least 600,000 units in 2027 [28][29]
- **Collaboration with ARM**: OpenAI is also working with ARM on ASIC projects, indicating a dual approach to chip development [30][31]

Apple
- **ASIC Projects**: Apple is customizing two ASIC chips, with mass production not expected before 2027 [32][33]

TikTok
- **Neptune Chip**: After negotiations, TikTok is expected to resume mass production of its ASIC chip in Q1 2026, with an anticipated production volume of 500,000 units [34][35]

GUC (Global Unichip Corp)
- **Controversial Position**: GUC is involved in the production of Google's Tamar CPU but is also engaged in more profitable projects like Tesla's AI5 chip, which could generate significant revenue in 2027 [41][43]

Additional Insights
- The document highlights the competitive landscape among major players in the ASIC market, with companies like Marvell, GUC, and Broadcom playing crucial roles in the design and production of these chips [41][42]
- The anticipated growth in demand for ASIC chips, particularly in the context of AI and machine learning applications, suggests a robust market outlook for the coming years [25][26]

This summary encapsulates the key developments and projections within the ASIC industry, focusing on the major players and their respective projects.
Morgan Stanley upgrades SMIC; the current bottleneck is not at TSMC
傅里叶的猫· 2025-10-21 15:34
Group 1
- Morgan Stanley upgraded SMIC's rating, raising the target price from HKD 40 to HKD 80, anticipating an expansion in leading-edge capacity and resolution of equipment bottlenecks [2]
- China Mobile announced plans to deploy local networks of 100,000 GPUs by 2028, leading to an updated revenue forecast for China's AI GPU market, projected to reach RMB 113 billion in 2026 and RMB 180 billion in 2027, with a compound annual growth rate of 62% [2]
- The report indicates that while NVIDIA's market share in China is nearly zero, there are still opportunities for local suppliers to fill the gap, particularly in AI high-performance computing and other semiconductor demands [2]

Group 2
- The bottleneck in the semiconductor market is not expected to be TSMC's capacity but rather specific memory or server rack components, with TSMC reporting stronger-than-expected AI demand [3]
- AI cluster sizes are moving towards over 100,000 GPUs, driving new standards in Ethernet design and liquid cooling for AI racks [3]
- The semiconductor supply chain is projected to expand significantly by 2026, with a focus on CPO and NAND module manufacturers [4]

Group 3
- Global CoWoS consumption is expected to reach 1,154k wafers in 2026, with NVIDIA holding a 59% market share, and HBM consumption projected at 2.6 billion GB [5] (these figures are combined in the sketch below)
- AI capital expenditures remain strong, with cloud capex expected to reach USD 582 billion in 2026, reflecting 31% annual growth [5]
- AI GPU and ASIC rental prices have seen slight declines, but demand for AI inference in China remains robust, indicating a positive outlook for the AI supply chain [5]
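The Group 3 figures can be combined into a few derived numbers: NVIDIA's wafer share of the 2026 CoWoS forecast, and the prior-year cloud capex implied by US$582 billion at 31% growth. This minimal sketch uses only the figures quoted above; the HBM-per-wafer ratio at the end is just a crude cross-metric, since not all HBM ships on CoWoS packages.

```python
# Derived figures from the quoted 2026 forecasts.
cowos_wafers_2026_k = 1154        # thousand wafers
nvidia_share = 0.59               # NVIDIA share of CoWoS consumption
hbm_gb_2026 = 2.6e9               # projected HBM consumption in GB
cloud_capex_2026_usd_bn = 582     # projected cloud capex
capex_growth = 0.31               # 31% annual growth

nvidia_wafers_k = cowos_wafers_2026_k * nvidia_share
implied_2025_capex = cloud_capex_2026_usd_bn / (1 + capex_growth)
hbm_gb_per_wafer = hbm_gb_2026 / (cowos_wafers_2026_k * 1_000)

print(f"NVIDIA CoWoS wafers, 2026: ~{nvidia_wafers_k:.0f}k")        # ~681k
print(f"Implied 2025 cloud capex: ~US${implied_2025_capex:.0f}bn")  # ~US$444bn
print(f"Crude HBM per CoWoS wafer: ~{hbm_gb_per_wafer:,.0f} GB")    # ~2,253 GB
```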