Nvidia's Jensen Huang denies rumors of trouble in the OpenAI deal: everything is proceeding as planned
Hua Er Jie Jian Wen· 2026-02-03 20:43
Nvidia CEO Jensen Huang denied rumors of tensions with OpenAI, confirming that the company will participate in OpenAI's next funding round, which he called "the largest private fundraising in history." On Tuesday, February 3, in an interview with CNBC's Jim Cramer, Huang stated plainly that the company's plan to invest in OpenAI is still "proceeding as planned," rejecting recent market rumors of strained relations between the two. He stressed: "There is no dispute at all; these claims are pure nonsense... There's no drama, everything is proceeding as planned... We are very happy to work with OpenAI." Huang further confirmed that Nvidia will participate in OpenAI's next funding round, which he said will be "the largest private fundraising round in history." On Monday, Altman responded on social media to the chatter about the company's relationship with Nvidia. Last September, Huang and OpenAI CEO Sam Altman jointly announced a letter of intent under which Nvidia plans to invest up to $100 billion in the AI lab in tranches. Under the agreement, OpenAI would build AI infrastructure based on Nvidia technology, with power requirements of up to 10 gigawatts. In recent months, however, Altman has said OpenAI does not have enough chips to meet demand for products such as ChatGPT, and that more computing power would translate into more revenue. OpenAI has struck chip deals with Nvidia competitors, including AMD, Broadcom, and Cerebras. This series of moves has fueled market speculation about the relationship between the two ...
Nvidia's Jensen Huang denies OpenAI deal rumors: 'There's no drama'
CNBC· 2026-02-03 19:30
Core Viewpoint
- Nvidia CEO Jensen Huang confirmed that the company's investment plan in OpenAI is still on track despite recent tensions and reports suggesting the deal is "on ice" [1][2].

Group 1: Investment Plans
- Nvidia plans to invest up to $100 billion in OpenAI in tranches, with the investment aimed at building AI infrastructure that requires up to 10 gigawatts of power [1].
- Huang stated that Nvidia will participate in OpenAI's next fundraising round, which is expected to be the largest private round ever raised, potentially reaching $100 billion [3].
- Nvidia is open to investing in any of OpenAI's future fundraising rounds and aims to participate in an eventual IPO [3].

Group 2: Relationship Dynamics
- OpenAI has historically relied on Nvidia's graphics processing units for its AI models but has recently faced chip shortages, prompting it to seek deals with competitors such as AMD, Broadcom, and Cerebras [4].
- OpenAI CEO Sam Altman expressed positive sentiment toward Nvidia, emphasizing their strong working relationship and the quality of Nvidia's AI chips [5].

Group 3: Market Reaction
- Following the uncertainty surrounding the investment deal, Nvidia's shares fell by more than 3.4%, contributing to a broader decline in tech stocks, and are currently 13% below their October peak [2].
OpenAI reportedly "snubs" Nvidia's (NVDA.US) AI chips; Altman personally responds: the crazy claims have no basis
Zhi Tong Cai Jing Wang· 2026-02-03 12:24
According to the Zhitong Finance APP, people familiar with the matter say OpenAI is dissatisfied with the performance of several of Nvidia's (NVDA.US) latest AI chips and has been seeking alternatives since last year. The report says the core reason for OpenAI's strategic shift is the company's growing emphasis on chips specialized for the AI inference stage. Nvidia still holds an absolutely dominant position in chips for training large models, but AI inference chips have become the industry's new battleground. AI inference refers to the process by which a trained AI model applies its learned knowledge to analyze new, unlabeled data in order to make predictions or decisions, or to generate output. Responding to the reports, OpenAI CEO Altman posted on X: "We are very happy to work with Nvidia; they make the best AI chips in the world. We hope to remain a core major customer of Nvidia for a long time to come. I really don't understand where these crazy claims come from." An Nvidia spokesperson said in an emailed statement: "Customers continue to choose Nvidia's inference chips because we deliver the best performance and total-cost-of-ownership advantage at scale." The report stresses that the search by OpenAI and others for alternatives in the inference-chip market is a test of Nvidia's dominance in AI chips, and that it comes while the two sides are in talks over an investment partnership. In September 2025, Nvidia had announced plans involving Microsoft (MSFT.US ...
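The article's definition of inference (a trained model applying learned parameters to new, unseen inputs, with no further weight updates) can be made concrete with a minimal sketch. Everything below is illustrative: the tiny linear model and all names are hypothetical stand-ins for the large models the article discusses, not any production system.

```python
# Minimal illustration of training vs. inference for a 1-D linear model.
# Training updates parameters; inference is a single cheap forward pass.

def train(samples, lr=0.05, epochs=500):
    """Fit y ~ w*x + b by stochastic gradient descent (the 'training' phase)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = (w * x + b) - y
            w -= lr * err * x   # parameter update: only happens in training
            b -= lr * err
    return w, b

def infer(w, b, x):
    """Apply the learned parameters to a new input (the 'inference' phase)."""
    return w * x + b

# Labeled training data drawn from the line y = 2x + 1.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
w, b = train(data)
print(round(infer(w, b, 10.0), 2))  # prediction on an unseen input, close to 21.0
```

The asymmetry shown here is why the two workloads favor different chips: training is many passes over data with parameter updates, while inference is a latency-sensitive forward pass repeated per user request.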
Nvidia GPUs fall out of favor
Ban Dao Ti Hang Ye Guan Cha· 2026-02-03 01:35
Core Viewpoint
- OpenAI is dissatisfied with some of NVIDIA's latest AI chips and has been seeking alternatives since last year, indicating a potential shift in the relationship between these two prominent companies in the AI sector [2][3].

Group 1: OpenAI's Concerns
- OpenAI's dissatisfaction stems from the performance of NVIDIA's hardware in providing timely responses for specific queries, particularly in software development and AI communication, which has led to a need for new hardware to meet approximately 10% of its future inference computing demands [3][8].
- OpenAI has explored partnerships with startups like Cerebras and Groq to obtain faster inference chips, but negotiations with Groq fell through due to NVIDIA's $20 billion licensing agreement with Groq [4][5].

Group 2: NVIDIA's Position
- NVIDIA CEO Jensen Huang has denied reports of a strained relationship with OpenAI, asserting that the company plans to invest up to $100 billion in OpenAI and that customers continue to choose NVIDIA for inference due to its performance and cost-effectiveness [3][5].
- NVIDIA has engaged with companies like Cerebras and Groq to explore potential acquisitions of SRAM chip technology, which is crucial for enhancing inference capabilities [10].

Group 3: Market Dynamics
- The AI industry is witnessing a shift toward inference-focused chips, with OpenAI's efforts reflecting a broader trend in which companies prioritize speed and efficiency in processing user requests [7][8].
- Competitors like Anthropic and Google benefit from using proprietary chips designed specifically for inference, which may give them performance advantages over NVIDIA's general-purpose AI chips [8].
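The article ties the interest in SRAM to inference speed. A rough way to see why memory technology dominates single-stream inference latency is a roofline-style estimate: generating one token typically streams the full model weights from memory, so token rate is bounded by bandwidth divided by model size. All numbers below are assumed, illustrative figures, not vendor specifications.

```python
# Back-of-envelope: single-stream token generation is usually
# memory-bandwidth-bound, so tokens/sec ~ bandwidth / bytes-per-token-pass.
# All constants are illustrative assumptions, not measured vendor specs.

MODEL_BYTES = 70e9   # assumed 70B-parameter model at 1 byte per parameter
HBM_BW = 3.4e12      # assumed HBM-class GPU memory bandwidth, bytes/s
SRAM_BW = 25e12      # assumed on-chip SRAM-class aggregate bandwidth, bytes/s

def tokens_per_second(bandwidth_bytes_s: float, model_bytes: float) -> float:
    """Roofline estimate: each generated token streams the full weights once."""
    return bandwidth_bytes_s / model_bytes

hbm_tps = tokens_per_second(HBM_BW, MODEL_BYTES)
sram_tps = tokens_per_second(SRAM_BW, MODEL_BYTES)
print(f"HBM-class:  ~{hbm_tps:.0f} tokens/s per stream")
print(f"SRAM-class: ~{sram_tps:.0f} tokens/s per stream")
```

Under these assumptions the SRAM-class figure is several times higher, which is the kind of latency gap the article suggests motivates SRAM-heavy inference designs; real systems complicate this with batching, caching, and quantization.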
Shares suddenly fall 2.89%! Reuters: OpenAI unhappy with some of Nvidia's latest AI chips and seeking alternatives; a major test for Nvidia's AI dominance!
Mei Gu IPO· 2026-02-02 23:15
Citing multiple people familiar with the matter, media reported that OpenAI is unhappy with some of Nvidia's latest AI chips and has been looking for alternatives since last year, which could further complicate the relationship between the two most closely watched companies of the AI boom. OpenAI's strategic shift stems from its growing emphasis on chips used for specific parts of AI inference. Inference refers to the computation that an AI model, such as the one behind the ChatGPT app, performs when responding to user questions and requests. Nvidia still dominates the chips needed to train large AI models, while inference is becoming the new battleground. Analysts say the decision by OpenAI and other companies to seek alternatives in the inference-chip market marks a major test of Nvidia's AI dominance. On Monday, Nvidia closed down nearly 2.9%. The two companies are still in investment talks: last September, Nvidia said it planned to put up to $100 billion into OpenAI as part of a deal that would give Nvidia a stake in the startup while providing OpenAI with the funds to buy advanced chips. In the meantime, OpenAI has reached agreements with AMD and other companies to procure chips that compete with Nvidia's ...
64 funding rounds over $100 million: reading Silicon Valley's new logic from these 16 newly minted AI stars
36Kr· 2026-01-28 12:52
Core Insights
- In 2025, the focus of AI venture capital in the U.S. shifted from "spreading wide" to "placing big bets," with 64 deals exceeding $100 million, including 8 companies receiving multiple large investments, leading to rising valuations [1][4].

Group 1: Investment Trends
- The trend toward headline investments is evident, with 35 transactions exceeding $200 million covering 29 companies in 2025 [3].
- Investment is primarily directed along two main lines: restructuring the physical foundations of AI and targeting core business flows in high-value industries [4][54].
- AI infrastructure investments accounted for 7 of the 18 major funding rounds, with significant amounts raised by companies like Cerebras and Unconventional AI [5].

Group 2: Notable Companies and Their Innovations
- Unconventional AI raised $475 million in seed funding, focusing on bio-inspired computing and aiming for a theoretical efficiency improvement of 1000x over traditional GPUs [6][7].
- Cerebras Systems secured $1.1 billion in Series G funding, specializing in wafer-scale AI computing, with a product designed for accelerated training and low-latency inference [8][10].
- Celestial AI completed a $250 million Series C, developing photonic AI accelerators that significantly enhance efficiency compared to traditional GPUs [12].
- Modular raised $250 million, focusing on unified AI computing infrastructure with a product that optimizes across various hardware [13][15].
- Fireworks AI, an open-source large-model cloud platform, raised $250 million, providing extensive AI infrastructure services [17][18].

Group 3: AI Applications in Vertical Industries
- Cognition AI raised $400 million, developing an AI engineer capable of independent software development, targeting tech companies and financial institutions [20][21].
- Sierra, an AI-driven conversational platform, raised $350 million, focusing on automating customer interactions across various sectors [23][25].
- Ambience Healthcare, specializing in clinical documentation automation, raised $243 million, aiming to reduce documentation time for healthcare providers [27][28].
- OpenEvidence, an AI clinical decision support company, raised $200 million, providing real-time answers to clinical questions based on authoritative medical literature [30][32].
- EliseAI, an automation platform for real estate and healthcare, raised $250 million, focusing on operational efficiencies in both sectors [35][36].

Group 4: AI for Science
- Lila Sciences, an AI-driven scientific platform, raised $350 million, integrating generative AI and automated laboratories for research [48][49].
- Periodic Labs, focusing on materials science, raised $300 million, developing a triad science stack for accelerating material discovery [50].
- SandboxAQ, a quantum and AI technology company, raised $450 million, providing solutions for post-quantum cryptography and AI-driven quantum simulations [51][53].
Inference demand surpasses training: why has this kind of chip become the key to winning in automotive intelligence?
Zhong Guo Qi Che Bao Wang· 2026-01-26 08:52
Core Insights
- The integration of AI inference chips is becoming crucial for automotive intelligence as autonomous driving approaches [2][10].
- Demand for inference chips is expected to increase significantly by 2026 due to the rapid growth of automotive intelligence needs [3].

Inference Demand Surge
- AI model training has been a key growth driver for the AI chip market, with high-end chips like NVIDIA's H100 and H200 highly sought after, often resulting in multi-million-dollar orders [4].
- Inference chips have now surpassed training chips in demand, becoming the new mainstay for data-center computing power and smart-driving applications, as companies focus on translating large models into practical applications [4][5].

Automotive Intelligence Key to Success
- Autonomous vehicles are evolving into highly integrated "smart mobile terminals" that require real-time decision-making capabilities, supported by the powerful computing of inference chips [6].
- A Level 4 autonomous vehicle can generate several gigabytes of data per second, necessitating rapid processing and analysis for effective driving decisions [6][7].

Performance and Efficiency of Inference Chips
- Inference chips are designed for edge computing, allowing immediate data processing without relying on cloud transmission, which is critical for timely decision-making in autonomous driving [7].
- New-generation inference chips use advanced architectures and manufacturing processes, such as 7nm technology, to deliver high performance while significantly reducing energy consumption [8].

Customization for Autonomous Driving
- Inference chips must be tailored for core tasks in autonomous driving, such as visual recognition and decision control, through customized neural-network accelerators that enhance processing efficiency and accuracy [9].

Industry Transformation with Inference Chips
- Inference chips represent a pivotal point in AI industry development, acting as a bridge from research to market application and playing an essential role in automotive intelligence [10].
- Achieving automotive-grade certification is a significant hurdle for inference chips, requiring rigorous environmental testing to ensure reliability and stability throughout the vehicle's lifecycle [10][11].

Challenges and Future Outlook
- Algorithm adaptation is a key challenge for inference chips in automotive applications, necessitating close collaboration between chip manufacturers and automotive companies to optimize performance [11].
- The rise of inference chips marks a new phase in the AI and autonomous-driving industries, addressing core issues such as cost, latency, and privacy, and enabling deeper integration of AI technologies into operational contexts [11][12].
- As AI technology and automotive hardware converge, the application prospects for inference chips will expand, with increasing competition among automotive companies to develop more competitive autonomous-driving solutions [12].
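The edge-versus-cloud argument above can be sanity-checked with simple arithmetic. The sketch below uses assumed, illustrative figures only (sensor rate, frame rate, and latencies are not vendor measurements): at a 30 Hz perception rate the decision budget is about 33 ms per frame, which a typical cellular round trip to a cloud data center already exceeds, while on-board inference can fit.

```python
# Back-of-envelope latency/throughput check for on-vehicle (edge) inference.
# All numbers are illustrative assumptions, not measured specifications.

SENSOR_RATE_GB_S = 4.0       # assumed raw sensor output ("several GB/s")
FRAME_RATE_HZ = 30           # assumed perception frame rate
CLOUD_ROUND_TRIP_MS = 100.0  # assumed cellular round trip to a data center
EDGE_INFER_MS = 20.0         # assumed on-chip inference time per frame

frame_budget_ms = 1000.0 / FRAME_RATE_HZ                # time available per frame
data_per_frame_mb = SENSOR_RATE_GB_S * 1024 / FRAME_RATE_HZ  # MB to process per frame

print(f"per-frame budget: {frame_budget_ms:.1f} ms")
print(f"data per frame:   {data_per_frame_mb:.0f} MB")
print("cloud fits budget:", CLOUD_ROUND_TRIP_MS <= frame_budget_ms)
print("edge fits budget: ", EDGE_INFER_MS <= frame_budget_ms)
```

Under these assumptions the cloud path misses the per-frame deadline before any computation even starts, which is the core argument for edge inference chips in the section above.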
What Bubble? Nvidia CEO Says AI Needs Trillions More in Investments
Yahoo Finance· 2026-01-21 22:57
Core Insights
- The AI industry requires "trillions of dollars" in investment for infrastructure development to avoid failure, according to Nvidia CEO Jensen Huang [1].
- Huang describes AI as a "five-layer cake," emphasizing that each layer, from energy to applications, necessitates significant investment, with current commitments at around $1.5 trillion for 2025 alone [2].
- Nvidia's market capitalization is now comparable to the total value of all mined silver, highlighting the financial impact of the AI boom [3].

Investment and Market Dynamics
- Huang's statements come amid market volatility, particularly after a Chinese startup's chatbot caused a 17% drop in Nvidia shares [4].
- Despite substantial investments in generative AI, a study from MIT indicates that 95% of organizations are seeing no return on their investments, raising concerns about potential waste [5].
- The financing structure within the AI sector has been criticized for creating a closed loop, in which Nvidia's investment in OpenAI leads to increased demand for its own chips [6].

Competitive Landscape
- Companies are taking measures to mitigate Nvidia's market dominance, with OpenAI signing a $10 billion deal with Cerebras for faster AI chip technology and forming partnerships with AMD and Broadcom [7].
- Google is promoting its custom Tensor Processing Units (TPUs) as alternatives, with Anthropic agreeing to utilize up to one million TPUs, while Meta is also exploring Google's silicon for its data centers [8].
The reason for the semiconductor rally may have been found! Hard-tech broad-based fund, the STAR-and-ChiNext Leaders ETF (588330), jumps over 3% intraday as Loongson Technology hits its 20% limit-up
Xin Lang Cai Jing· 2026-01-21 06:07
The STAR Market rallied sharply today (January 21), and the ChiNext also performed strongly. The STAR-and-ChiNext Leaders ETF (588330), a hard-tech broad-based fund covering high-growth leaders on both boards, rose more than 3.1% intraday and was last up 2.51%.

| No. | Name | Change | SWS level-1 industry | SWS level-2 | SWS level-3 | Market cap | Turnover |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Loongson Technology | +20.00% | Electronics | Semiconductors | Digital chip design | 71.3bn yuan | 2.15bn yuan |
| 2 | Montage Technology | +13.96% | Electronics | Semiconductors | Digital chip design | 186.8bn yuan | 11.84bn yuan |
| 3 | Hygon Information | +13.63% | Electronics | Semiconductors | Digital chip design | (garbled) | 15.38bn yuan |
| 4 | Chaozhou Three-Circle | +6.32% | Electronics | Components | Passive components | 102.2bn yuan | 1.20bn yuan |

(The remaining table rows are garbled in the source and the list is truncated.) ...
AI chips, memory chips, and related themes lead gains; Wanjia Artificial Intelligence ETF (159248) rises over 2%, with net inflows for three consecutive days
Xin Lang Cai Jing· 2026-01-21 02:28
In early trading on January 21, 2026, AI-chip and memory-chip themes led gains. As of 10:04, the CSI Artificial Intelligence Theme Index (930713) was up a strong 2.07%; among constituents, Montage Technology rose 10.26%, Bestechnic 5.29%, and Jingjia Micro 4.62%, with VeriSilicon, Sugon, and other stocks following. The Wanjia Artificial Intelligence ETF (159248) rose 2.22%. The ETF has seen net inflows for three consecutive days, with a single-day high of 6.49 million yuan and a total of 11.36 million yuan absorbed. Recently, AI infrastructure build-out has accelerated: Meta formally established a new "Meta Compute" unit to coordinate its global data-center fleet and supplier partnerships, targeting tens of gigawatts of AI compute capacity this decade and ultimately hundreds of gigawatts; meanwhile, Cerebras landed an OpenAI deal worth more than $10 billion, committing to supply 750 megawatts of compute by 2028, underscoring leading AI companies' urgent demand for heterogeneous compute and their long-term commitment. On the policy front, the industrial-internet policy framework has been comprehensively upgraded, with three "AI + manufacturing" special actions landing in quick succession, covering the Implementation Opinions on the "AI + Manufacturing" Special Action, the Action Plan for High-Quality Development of Industrial Internet Platforms (2026-2028), and the Action Plan for Integrating the Industrial Internet and AI, with a clear goal of launching 1,000 high-level industrial intelligence ...