B200
X @外汇交易员
外汇交易员· 2025-11-26 12:28
The Information: Chinese regulators have blocked ByteDance from using NVIDIA chips in a new data center. 外汇交易员 (@myfxtrader): Reuters: the Chinese government has issued guidance requiring that all new state-funded data center projects use only domestically made AI chips. Data centers that are less than 30% complete must remove any foreign chips already installed or cancel planned purchases; projects at a more advanced stage will be handled case by case. The new guidelines cover NVIDIA's H20 chip as well as more powerful processors such as the B200 and H200. ...
The TPUs Google used to train Gemini 3 have become a major threat to Jensen Huang, and Meta has already defected
36Ke· 2025-11-25 11:44
Google is no longer content to be a "cloud landlord." It has launched an aggressive TPU@Premises program to sell compute hardware directly into the data centers of giants like Meta, targeting roughly 10% of NVIDIA's revenue. The flagship TPU v7 matches NVIDIA's B200 on compute and memory, and Google's spec-for-spec comparison shows that, at the cutting edge of hardware, Jensen Huang is no longer alone. By embracing PyTorch to chip away at the CUDA moat, Google is using the combination of on-premises deployment and comparable performance to breach the walls of the trillion-dollar chip empire. In this trillion-dollar AI race, Jensen Huang's NVIDIA empire has long enjoyed an unchallenged solitude: if you want to train the best models, you buy NVIDIA cards; if you find them too expensive, you rent NVIDIA cards from a cloud provider. But this autumn, Google decided to stop being just a landlord and become an arms dealer. According to people familiar with the matter, Google is preparing an aggressive program code-named TPU@Premises, aiming to break NVIDIA's near-monopoly on high-end AI chips. The core of the plan is disruptive: Google will no longer require customers to use TPUs only inside Google Cloud, and will instead let them move these compute monsters into their own data centers. The first target of this offensive is one of NVIDIA's largest customers, Meta. Zuckerberg's calculus, and a bet worth billions of dollars: Gemini 3 has technically closed the gap with OpenAI, and it was trained entirely on TPU ...
If the H200 is cleared for export, would we accept it?
是说芯语· 2025-11-22 23:55
This article is reproduced from 傅里叶的猫, by 猫叔, a former senior engineer at a major chip EDA vendor who writes about technology, the industry, and investing. News that the H200 might be cleared for export spread widely yesterday, and most domestic coverage framed it accordingly. The story actually broke first on Bloomberg, more than two hours before Reuters. Bloomberg's wording makes clear that, for now, this is only a preliminary discussion, and it is entirely possible it stays at that stage and the chip is never cleared. The background is the recent meeting between the US and Chinese leaders: Trump had said Blackwell would come up, and everyone assumed the B30A would be cleared. As it turned out, Trump later said Blackwell was not discussed. Two days after that, the WSJ reported that the topic was dropped because Trump's senior advisers all opposed it; we posted this in our 星球 group at the time. On the morning of the leaders' meeting, a friend sent me a screenshot to the same effect, so the possibility of clearing high-end Hopper parts has probably been under discussion for quite a while. Back to the point: this time the claim is that the H200 may be cleared, so first, a look at the H200's specifications:

| Specification | H100 | H200 |
| --- | --- | --- |
| GPU Architecture | Hopper | Hopper |
| GPU Memory | 80 GB HBM3 | 141 ... |
Will the H200 be exported to China? The US is considering it
半导体行业观察· 2025-11-22 03:09
Source: compiled from chosun. The H200, more powerful than the currently permitted H20, is under review as US export policy shifts. The US is reportedly weighing whether to allow exports of NVIDIA graphics processing units (GPUs) to China. According to Bloomberg on the 21st (local time), the US is internally discussing whether to permit exports of the H200, a GPU used for artificial intelligence (AI), to China. Released in 2023 and based on the previous-generation Hopper architecture, the H200 is the highest-performing AI chip of its class. Although it trails the B200, which is built on the newer Blackwell architecture, it outperforms the H20, the cut-down chip of the same generation that the US currently allows to be exported to China. However, sources say no final decision has been made, and the discussions may not lead to actual export approval. President Trump has previously said that while selling NVIDIA semiconductors to China is possible, the most advanced products should not be sold. In an interview earlier this month, speaking about AI semiconductor sales, Trump said he would "let them (China) deal with that issue with NVIDIA," but added that "no one other than the United States should have the most advanced semiconductors." US Treasury Secretary Scott Bessent has also said that Blackwell chips could only be sold in a year or two, once they are no longer considered ...
Earnings preview | AI chip leader NVIDIA (NVDA.US) faces another big test, with Wall Street betting on "a beat plus raised guidance"
智通财经网· 2025-11-17 04:03
According to 智通财经APP, NVIDIA (NVDA.US) will report fiscal 2026 third-quarter results after the market close on November 19, with earnings expected to beat expectations again: adjusted EPS is forecast at $1.26, and the market expects quarterly revenue of $55.28 billion, up more than 55% year over year. Over the past year, NVIDIA's revenue growth has been driven by strong demand for the chips needed to develop generative AI models. NVIDIA dominates the generative AI chip market, and these chips have proven useful across many industries, including marketing, advertising, customer service, education, content creation, healthcare, automotive, energy and utilities, and video game development. Growing demand to modernize workflows across industries is expected to drive demand for generative AI applications. According to a recent Fortune Business Insights report, the global generative AI market is projected to reach $967.65 billion by 2032, growing at a compound annual rate of 39.6% from 2024 to 2032. The complexity of generative AI requires extensive knowledge and enormous computing power, which means enterprises will need to significantly upgrade their network infrastructure. NVIDIA's AI chips, including the A100, H100, B100, B200, B300, GB200, and GB300, are the chips of choice for building and running these powerful AI applications, making the company the leader in this space. As the generative AI revolution unfolds, NVIDIA's advanced chips are expected to drive its revenue and market position ...
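For context, a quick back-of-the-envelope check of the growth claim above, using only the figures quoted in the preview (a sketch; the year-ago quarter is inferred from those figures, not reported here):

```python
# A sketch: back out the implied year-ago quarter from the preview's own numbers.
consensus_revenue_b = 55.28   # consensus Q3 FY2026 revenue, in $ billions (from the article)
claimed_min_growth = 0.55     # "up more than 55% year over year" (from the article)

# If $55.28B represents more than 55% growth, the year-ago quarter must sit below this ceiling.
implied_year_ago_ceiling_b = consensus_revenue_b / (1 + claimed_min_growth)
print(f"Implied year-ago revenue ceiling: ${implied_year_ago_ceiling_b:.2f}B")
# -> about $35.66B, i.e. the claim is consistent with a year-ago quarter in the mid-$30B range.
```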
NVIDIA: this quarter should refocus attention on NVIDIA's market leadership
2025-11-16 15:36
Summary of NVIDIA Corp. Conference Call

Company Overview
- **Company**: NVIDIA Corp. (NVDA.O)
- **Industry**: Semiconductors
- **Market Cap**: $4,615.28 billion
- **Current Stock Price**: $186.86 (as of November 13, 2025)
- **Price Target**: Increased from $210.00 to $220.00 [1][6][26]

Key Points

Market Performance and Expectations
- The market has improved significantly over the last 45 days, leading to expectations of strong quarterly results as the Blackwell product line ramps up [1][3]
- NVIDIA's stock has performed well but has lagged behind AI peers, which is anticipated to change [1][10]

Demand and Supply Dynamics
- Industry checks indicate a material acceleration in demand, with NVIDIA resolving previous supply chain issues [3][11]
- Growth bottlenecks are now more related to complementary hardware (storage, memory, servers) than to NVIDIA's production capabilities [3][18]
- Positive demand signals from customers and suppliers suggest accelerating growth, contrary to consensus expectations that growth has peaked [11][14]

Financial Projections
- Revenue estimates for the upcoming quarters have been raised, with projections of $55.0 billion for October and $63.1 billion for January, marking the highest sequential revenue growth in the industry's history [22][27]
- FY27 estimates have been increased from $278.0 billion/$6.59 EPS to $298.5 billion/$7.11 EPS, reflecting strong demand and backlog [22][26]

Competitive Landscape
- NVIDIA's Blackwell remains the preferred AI chip, with strong demand signals noted [10][21]
- Despite potential share loss to competitors like AMD, NVIDIA's product leadership is expected to remain solid [21][31]
- The company is positioned to benefit from the growing AI market, with significant revenue potential from data centers and generative AI solutions [31][34]

Risks and Constraints
- While there are no immediate shipment constraints, potential risks include power availability and supply chain issues related to memory and optics [18][21]
- The company is cautious about future forecasts, maintaining a conservative approach compared to peers [20][21]

Investment Thesis
- The stock is rated Overweight, with strong conviction in upward revisions to estimates given NVIDIA's competitive position and growth potential in the AI sector [28][31]
- The price target reflects a valuation at a premium to the semiconductor group but a discount to large-cap AI peers, indicating confidence in NVIDIA's growth trajectory [26][34]

Conclusion
- NVIDIA is expected to continue its market leadership in the semiconductor industry, driven by strong demand for AI and data center solutions, with financial projections indicating robust growth in the coming years [31][34]
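The sequential jump implied by the note's projections can be checked directly; a minimal sketch, using only the $55.0 billion and $63.1 billion figures quoted above:

```python
# A sketch: sequential growth implied by the note's quarterly projections.
oct_quarter_b = 55.0   # projected October-quarter revenue, $ billions (from the note)
jan_quarter_b = 63.1   # projected January-quarter revenue, $ billions (from the note)

seq_increase_b = jan_quarter_b - oct_quarter_b
seq_growth_pct = seq_increase_b / oct_quarter_b * 100
print(f"Sequential increase: ${seq_increase_b:.1f}B ({seq_growth_pct:.1f}% quarter over quarter)")
# -> about $8.1B, or roughly 14.7% sequential growth.
```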
NVIDIA Poised for a Q3 Earnings Surprise: Buy Before the Beat?
ZACKS· 2025-11-14 13:20
Key Takeaways: NVIDIA expects Q3 revenues of $54 billion, and the consensus sees 55.6% growth from the year-ago period. Data Center, Gaming, Professional Visualization and Automotive are all projected to post strong gains. Positive Earnings ESP and solid segment trends support expectations for another quarterly beat. NVIDIA Corporation (NVDA) is likely to beat on earnings when it reports third-quarter fiscal 2026 results on Nov. 19, after market close. The company expects revenues of $54 billion (+/-2%) for the ...
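The "$54 billion (+/-2%)" guidance quoted above translates into a fairly narrow band; a small sketch of that range (the $55.28 billion consensus mentioned for comparison comes from the earnings preview earlier in this digest):

```python
# A sketch: the revenue band implied by "$54 billion (+/-2%)".
midpoint_b = 54.0   # guided midpoint, $ billions (from the article)
band = 0.02         # +/- 2%

low_b, high_b = midpoint_b * (1 - band), midpoint_b * (1 + band)
print(f"Guided range: ${low_b:.2f}B to ${high_b:.2f}B")
# -> $52.92B to $55.08B; the $55.28B consensus quoted in the preview above
#    sits just above the top of the guided range.
```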
US tech giants can buy the chips, but they can't buy the electricity
虎嗅APP· 2025-11-11 15:17
Core Viewpoint
- OpenAI has emerged as a leading player in the AI sector, heavily investing in data centers and GPU acquisitions, but faces significant challenges due to electricity shortages and inefficiencies in energy usage [5][11][12].

Group 1: AI and Power Consumption
- The total electricity consumption of data centers in the U.S. reached 176 terawatt-hours (TWh) in 2023, accounting for 4.4% of the national electricity generation, with projections to double by 2028 [11].
- The average Power Usage Effectiveness (PUE) globally in 2024 is expected to be 1.56, indicating that only two-thirds of electricity is used for GPU computing, while the rest is wasted on cooling and other systems [15].
- The inefficiency of AI systems is highlighted, as they consume significant power while having low utilization rates, exacerbating the electricity crisis [10][12].

Group 2: Challenges in the U.S. Energy System
- The aging U.S. power infrastructure is struggling to meet the increasing demand from AI technologies, leading to rising electricity costs for consumers [12][13].
- The shift towards nuclear power and the reduction of renewable energy projects have further complicated the energy landscape, making it difficult to sustain the growing needs of AI companies [16][17].

Group 3: Future of AI Chips
- Current AI chips like the H100 and A100 are becoming outdated, with newer models (H200, B200, B300) expected to dominate the market by 2025, potentially rendering older chips obsolete if they remain unused due to power shortages [20][22].
- The stock prices of AI companies are closely tied to their GPU availability, and any delays in utilizing these chips could negatively impact their market valuations [22][24].

Group 4: Strategies for Energy Supply
- Companies are exploring various strategies to secure energy, including building new power plants and relocating data centers to countries with more favorable energy conditions, although this presents its own set of challenges [25][27].
- Some companies are even considering space-based data centers powered by solar energy, although this concept is still in experimental stages and poses numerous technical challenges [28][31].

Group 5: Comparison with China
- In contrast to the U.S., China's data center electricity consumption is significantly lower at 166 TWh, representing about 2% of total social electricity use, while the country also focuses on green energy initiatives [33][34].
- The emphasis on sustainable energy practices in China suggests a more stable environment for AI development compared to the energy crisis faced in the U.S. [34][36].
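The "two-thirds" claim follows directly from the definition of PUE (total facility energy divided by IT-equipment energy); a minimal sketch of the arithmetic, assuming only the 1.56 figure quoted above:

```python
# A sketch: what a PUE of 1.56 implies, using PUE = total facility energy / IT-equipment energy.
pue = 1.56

it_share = 1 / pue             # fraction of facility power that reaches IT gear (GPUs, servers)
overhead_share = 1 - it_share  # cooling, power conversion, lighting, etc.
print(f"IT share: {it_share:.0%}, overhead: {overhead_share:.0%}")
# -> roughly 64% vs 36%: about two-thirds of the electricity reaches the compute,
#    and about one-third goes to facility overhead.
```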
US tech giants can buy the chips, but they can't buy the electricity
36Ke· 2025-11-11 04:31
Core Insights
- OpenAI has been aggressively investing in AI infrastructure, including a $300 billion partnership with Oracle for data centers and a $100 billion chip purchase from NVIDIA, amidst a growing AI bubble driven by GPU sales [1][3]
- Microsoft CEO Satya Nadella highlighted a critical issue: the lack of electricity is hindering AI development, despite the abundance of chips [3][5]

Energy Consumption and Efficiency
- In 2023, U.S. data centers consumed 176 terawatt-hours (TWh) of electricity, accounting for 4.4% of the national total, with projections to double by 2028 [5][8]
- The average Power Usage Effectiveness (PUE) globally in 2024 is 1.56, indicating that only two-thirds of electricity is used for GPU computing, while one-third is wasted on cooling, power systems, and lighting [7][8]

Challenges in Power Supply
- The aging U.S. power grid is struggling to meet demand, leading to electricity costs for consumers that rose significantly from 2021 to 2022 [8][10]
- The shift in energy policy under the Trump administration, including cuts to renewable energy projects, has exacerbated the situation, making it difficult for tech companies to secure sufficient power for their operations [10][12]

Chip Lifecycle and Market Dynamics
- Current AI chips like the H100 and A100, released in 2022, may soon be outdated as newer models (H200, B200, B300) are set to dominate the market by 2025, potentially rendering existing inventory obsolete [12][14]
- The valuation of AI companies is closely tied to GPU availability and demand, meaning that unutilized chips could negatively impact stock prices [14][16]

Strategies for Mitigation
- Companies are exploring options to build new power plants, such as OpenAI and Oracle's joint natural gas facility in Texas, but face challenges including supply shortages for necessary equipment [16][18]
- Some firms are considering relocating data centers to countries with less developed power infrastructure, which could further strain local resources [18][19]

Global Comparison
- In contrast to the U.S., China's data centers consumed 166 TWh in 2024, representing about 2% of total electricity usage, with a focus on green energy and carbon reduction [22][24]
- The future of high-tech companies may hinge less on chip quantity and more on their ability to secure a reliable electricity supply for their operations [24]
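The US and China shares quoted above also imply very different national totals; a rough sketch derived only from the 176 TWh / 4.4% and 166 TWh / ~2% figures in the summary:

```python
# A sketch: national totals implied by the data-center shares quoted in the summary.
us_dc_twh, us_share = 176, 0.044   # US data centers: 176 TWh, 4.4% of generation (from the summary)
cn_dc_twh, cn_share = 166, 0.02    # China data centers: 166 TWh, ~2% of usage (from the summary)

us_total_twh = us_dc_twh / us_share
cn_total_twh = cn_dc_twh / cn_share
print(f"Implied US generation: ~{us_total_twh:,.0f} TWh")   # ~4,000 TWh
print(f"Implied China usage:   ~{cn_total_twh:,.0f} TWh")   # ~8,300 TWh
# Data centers of similar absolute size are a much smaller slice of China's far larger total.
```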
The world enters the new watt era
虎嗅APP· 2025-11-07 10:16
This article is reproduced from the WeChat official account 朋克周, by 朋克周 (header image: AI-generated). The endgame of AI is not intelligence but physics: thermodynamics, Joule's law, transformer capacity. When Altman publicly admitted that the biggest bottleneck for AGI is energy, and when Musk likened the compute race to an arms race and said electricity would be next year's shortage, the "infinite world" of pure information in the first narrative began to collapse. We have to recognize clearly that compute is rapidly turning from an asset into a liability. Every H100 that gets lit up is an insatiable unit of electricity consumption; while trillion-parameter models "think" in the cloud, the furnaces of the physical world burn. The real ticket to this game has never been how many GPUs you own, but how many watts you can deliver to light those GPUs up. Welcome to the "new watt era." In this era, tech giants, compute sovereignty, and even national will must bow before an older power. A ghost of the "old world" is returning, and it will become the final arbiter of the intelligence age. We call it the "central bank of watts." The birth of the "digital furnace": we have to rethink what a "data center" is. We live in an age torn between two narratives. The first narrative is the unlimited ascent of "intelligence": we cheer GPT-5's ...