Cerebras Systems
Vista Equity Partners and Intel to lead investment in AI chip startup SambaNova, sources say
Yahoo Finance· 2026-02-06 21:03
Core Viewpoint
- Vista Equity Partners is leading a $350 million funding round in AI chip startup SambaNova Systems, marking a shift from its traditional focus on enterprise software [1][4]

Group 1: Investment Details
- Vista, in partnership with Cambium Capital, is participating in the Series E funding round for SambaNova, with Intel Corp also investing approximately $100 million, potentially increasing to $150 million [2][3]
- The funding aims to help SambaNova compete with Nvidia Corp amid growing demand for inference chips used in AI applications [3]

Group 2: Market Context
- The investment comes amid a selloff in global software stocks, which have lost nearly $1 trillion in value as AI shifts from a supportive factor to a potential disruptor [5]
- Interest in AI hardware has surged, driven by increased dealmaking among companies seeking efficient chips for AI applications [5]

Group 3: Competitive Landscape
- Other AI chipmakers have also raised significant funding, with Cerebras Systems securing $1 billion at a valuation of $23 billion [6]
- OpenAI is exploring partnerships with companies like Groq and Cerebras to find alternatives to Nvidia GPUs for its computing needs [7]
RMB 159.6 Billion: An AI Chip Super-Unicorn Is Born
36Kr· 2026-02-05 05:15
Xindongxi reported on February 5 that US AI chip unicorn Cerebras Systems today announced the completion of a $1 billion (about RMB 6.9 billion) Series H financing round, reaching a valuation of $23 billion (about RMB 159.6 billion). The round was led by Tiger Global, with participation from Benchmark, Fidelity Management & Research Company, Atreides Management, Alpha Wave Global, Altimeter, AMD, Coatue, and 1789 Capital (whose partners include Donald Trump Jr.). Founded in 2015, Cerebras is known for its dinner-plate-sized AI chips. Its Wafer Scale Engine 3 (WSE-3) is the world's largest and fastest AI chip: 56 times the size of the largest current GPU, with far lower power consumption per unit of compute, and inference and training speeds more than 20x faster than competing products. The chip targets the sequential, memory-intensive workloads of AI inference. Unlike GPUs, which shuttle data back and forth between the chip and memory, the WSE keeps all computation on-chip, eliminating the memory-bandwidth bottleneck that limits GPU inference. Cerebras' previous round, announced in September 2025, was a $1.1 billion (about RMB 7.6 billion) Series G, at which time ...
Chipmaker Cerebras Systems completes Series H round, raising $1 billion at a valuation of about $23 billion
Jin Rong Jie· 2026-02-04 16:13
Chipmaker Cerebras Systems has completed a Series H financing round, raising $1 billion at a valuation of about $23 billion. ...
AI chip maker Cerebras Systems raises $1 billion in late-stage funding
Reuters· 2026-02-04 16:08
AI chip maker Cerebras Systems said on Wednesday it raised $1 billion in a late-stage funding round that valued it at $23 billion. ...
Cerebras Systems Raises $1 Billion Series H
Businesswire· 2026-02-04 16:00
Core Insights
- Cerebras Systems has closed a $1 billion Series H financing round, achieving a post-money valuation of approximately $23 billion [1]

Group 1: Financing Details
- The Series H round was led by Tiger Global, with participation from notable investors including Benchmark, Fidelity Management & Research Company, Atreides Management, Alpha Wave Global, Altimeter, AMD, Coatue, and 1789 Capital [1]
Q4 2025 Enterprise SaaS Public Comps and Valuation Guide (English)
PitchBook· 2026-02-03 02:00
Investment Rating
- The report does not explicitly provide an investment rating for the industry but indicates a cautious outlook for enterprise SaaS multiples into 2026 due to global uncertainty and technological disruptions [6]

Core Insights
- The median EV/TTM revenue multiple for public enterprise SaaS companies decreased to 5x at the end of Q4 2025, down from 5.3x in Q3 2025, and is expected to see limited upside into 2026 [6]
- Revenue growth rates for 2026 are anticipated to step down to high single digits or low double digits, with significant declines expected in several segments, while slight growth is expected in the collaboration, productivity, and creative segments [9]
- The median gross margin for public enterprise SaaS companies increased to nearly 77% in 2025, with continued strength but limited substantial growth expected in 2026 [10]
- The median EBITDA margin rose to 19.8% in 2025, with further strengthening expected across most segments into 2026 [11]

Summary by Sections

Revenue
- Revenue growth rates for enterprise SaaS companies are projected to decline significantly in 2026, with the median growth rate barely in double digits, down from prior-year rates of 15% to 30% [9]
- The report highlights specific segments expected to decline, including CRM; sales, marketing & CX; finance, ERP, HR & payroll; and data, analytics & AI platforms [9]

Valuation
- Valuation multiples have continued to decline, with 76 of 102 tracked companies seeing decreases in their EV/TTM revenue multiples from year-end 2024 to year-end 2025 [12]
- Notable companies that outperformed the broader SaaS decline include Unity, On24, and CS Disco, while companies like Ibotta and The Trade Desk saw significant decreases in their multiples [12]

Gross Margin and EBITDA
- The median gross margin across public enterprise SaaS companies is projected to remain strong at 77% in 2026, with some segments like DevOps and vertical SaaS expected to see slight growth [10]
- EBITDA margins are anticipated to continue improving, with the highest growth expected in data, analytics & AI platforms and in the collaboration, productivity & creative segments [11]
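The report's headline metric, enterprise value over trailing-twelve-month revenue, is straightforward to reproduce. A minimal sketch in Python, using made-up comparables rather than figures from the report:

```python
from statistics import median

def ev_to_ttm_revenue(market_cap: float, debt: float,
                      cash: float, ttm_revenue: float) -> float:
    """EV/TTM revenue: (market cap + debt - cash) / trailing revenue."""
    return (market_cap + debt - cash) / ttm_revenue

# Hypothetical comparables in $M; not companies from the report.
comps = [
    dict(market_cap=5_000, debt=500, cash=800, ttm_revenue=900),
    dict(market_cap=12_000, debt=0, cash=2_000, ttm_revenue=2_500),
    dict(market_cap=3_200, debt=300, cash=200, ttm_revenue=700),
]
multiples = [ev_to_ttm_revenue(**c) for c in comps]
print(round(median(multiples), 1))  # median multiple across the set
```

The median (rather than the mean) is what the report tracks, since it is robust to a few extreme outlier multiples.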
Big Chips on the Rise Again?
智通财经网· 2026-01-25 06:24
Core Insights
- In early 2025, significant developments in the AI chip sector were reported, including Elon Musk's confirmation of Tesla's (TSLA.US) revival of the Dojo 3 supercomputer project, aiming to become the largest AI chip manufacturer globally, and Cerebras Systems' multi-year procurement agreement with OpenAI worth over $10 billion, promising 750 megawatts of computing power by 2028 [1][2]

Group 1: AI Chip Evolution
- The evolution of AI chips is characterized by two distinct designs: Cerebras' wafer-scale integration and Tesla's Dojo, which represents a hybrid approach between single chips and GPU clusters [3]
- The divergence stems from different solutions to the "memory wall" and "interconnect bottleneck" challenges, with traditional GPU architectures facing limits on memory bandwidth relative to computational power [3][4]

Group 2: Cerebras' Innovations
- Cerebras' WSE-3 chip features 4 trillion transistors, 900,000 AI cores, and 44GB of on-chip SRAM, achieving a memory bandwidth of 21 PB/s, significantly outperforming NVIDIA's H100 [4]
- The design addresses the yield issues associated with large wafers by keeping each AI core small and employing redundancy to maintain performance despite defects [4]

Group 3: Tesla's Strategic Shift
- Tesla's Dojo project faced setbacks but was revived with a new focus on "space AI computing," moving away from its original goal of competing with NVIDIA's GPU clusters [7][8]
- The AI5 chip, designed on a 3nm process, is expected to enter production by the end of 2026, targeting performance comparable to NVIDIA's Hopper architecture [8]

Group 4: Market Dynamics and Competition
- The AI chip market is becoming increasingly crowded, with competitors like AMD and NVIDIA rapidly advancing their offerings, posing challenges for alternative architectures like wafer-scale systems [16][19]
- Cerebras aims to differentiate itself by focusing on low-latency inference systems, capitalizing on growing demand for real-time AI applications [16][14]

Group 5: Strategic Partnerships
- Cerebras' partnership with OpenAI, involving a $10 billion commitment for computing power, highlights the increasing importance of low-latency inference capabilities in the AI landscape [11][12]
- The collaboration reflects a broader trend of established tech companies integrating promising AI chip startups into their ecosystems, which may reshape the competitive landscape [20][21]
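The "memory wall" argument above can be made concrete with a back-of-envelope roofline check: a workload is memory-bound when the time to stream its bytes exceeds the time to do its arithmetic. The accelerator and model numbers below are illustrative assumptions, not the specs of any chip discussed in the article:

```python
def bound_by(flops: float, bytes_moved: float,
             peak_flops: float, mem_bw: float) -> str:
    """Simple roofline model: return which resource limits runtime."""
    compute_time = flops / peak_flops   # seconds spent on arithmetic
    memory_time = bytes_moved / mem_bw  # seconds spent moving data
    return "memory" if memory_time > compute_time else "compute"

# A token-by-token decode step streams every weight but does little
# arithmetic per byte, so off-chip bandwidth dominates on a GPU-like
# device (hypothetical: 1 PFLOP/s peak, 3 TB/s memory bandwidth).
flops = 2 * 70e9         # ~2 FLOPs per weight for a 70B-parameter model
bytes_moved = 70e9 * 2   # fp16 weights read once per generated token
print(bound_by(flops, bytes_moved, peak_flops=1e15, mem_bw=3e12))
```

Keeping the working set in on-chip SRAM raises the effective `mem_bw` by orders of magnitude, which is the lever the wafer-scale design pulls to move inference back toward the compute-bound regime.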
Big Chips on the Rise Again?
半导体行业观察· 2026-01-25 03:52
Core Insights
- The article discusses significant developments in the AI chip sector, highlighting Tesla's revival of the Dojo 3 supercomputer project and Cerebras Systems' multi-billion-dollar agreement with OpenAI for AI computing power [1][10]

Group 1: AI Chip Developments
- Tesla's Dojo 3 project aims to position the company as a leading AI chip manufacturer, with a focus on "space artificial intelligence computing" rather than traditional training workloads [6][8]
- Cerebras Systems has secured a contract with OpenAI worth over $10 billion, promising to deliver 750 megawatts of computing power by 2028, underscoring the growing demand for low-latency inference capabilities [10][11]

Group 2: Chip Architecture and Performance
- The article distinguishes two types of large chips: Cerebras' wafer-scale integration and Tesla's wafer-scale system, each addressing the "memory wall" and "interconnect bottleneck" challenges differently [2][4]
- Cerebras' WSE-3 chip packs 4 trillion transistors and 900,000 AI cores, achieving a memory bandwidth of 21 PB/s, significantly outperforming NVIDIA's H100 [3][11]

Group 3: Strategic Shifts
- Tesla's shift in strategy reflects a recalibration of resources, moving away from competing directly with NVIDIA's GPU clusters toward specialized applications in space computing [7][8]
- Cerebras' positioning as a provider of dedicated inference machines lets it capitalize on emerging demand for low-latency processing, differentiating itself from traditional training platforms [15][19]

Group 4: Market Dynamics and Competition
- The AI chip market is becoming increasingly crowded, with competitors like AMD and NVIDIA rapidly advancing their offerings, posing challenges for the alternative architectures from Cerebras and Tesla [15][19]
- The collaboration between OpenAI and Cerebras is seen as a strategic move to secure a foothold in the burgeoning inference market, which is expected to dominate AI computing needs in the future [10][19]

Group 5: Future Outlook
- Advances in packaging technology, such as TSMC's CoWoS, are expected to blur the line between large- and small-chip architectures, potentially reshaping the competitive landscape [16][19]
- The article concludes that Tesla and Cerebras are not merely trying to replicate NVIDIA's success but are seeking value in niches overlooked by general-purpose solutions, signaling a long-term battle for survival and innovation in the AI chip market [20]
Harvard-Dropout "Trio" Build AI Chips and Just Raised RMB 3.5 Billion
创业邦· 2026-01-24 04:10
Purpose-built chips are on the rise. By 漫地; edited by 关雎. Three post-2000 Harvard dropouts recently raised $500 million for their AI chip startup, Etched.ai. It is one of the largest financings in AI hardware to date, valuing Etched.ai at nearly $5 billion and bringing its total funding to nearly $1 billion. Etched.ai founder Gavin Uberti is only 24 this year. After dropping out of Harvard with co-founders Chris Zhu and Robert Wachen, he has led the company in building next-generation AI chips. Unlike chip giant Nvidia, they have carved out a niche: ASICs specialized for the Transformer architecture that dominates today's AI models, aiming to outperform general-purpose GPUs. An ASIC is a chip custom-designed for one specific purpose, rather than one that can run many different kinds of programs the way a CPU (central processing unit) or GPU (graphics processing unit) can. The logic of the compute market is shifting. How can Etched.ai challenge Nvidia? The Harvard-dropout founders: the story of Etched.ai begins with a Harvard dropout, Gavin Uberti. Before founding Etched.ai, Gavin ...
$20 Billion! OpenAI's Annualized Revenue Grows Tenfold in Two Years
Xin Lang Cai Jing· 2026-01-19 13:16
Core Insights
- OpenAI's annual revenue is projected to exceed $20 billion by 2025, a significant increase from approximately $2 billion in 2023, driven by rapid expansion in computing power [1][4][6]
- The company's business model is designed to grow in sync with the actual value created by its AI systems in real-world applications, directly linking revenue performance to user engagement with its technology [1][4][6]

Computing Power Expansion
- OpenAI's computing capacity has tripled over the past year, reaching approximately 1.9 gigawatts in 2025, compared with about 0.2 gigawatts in 2023 [3][6]
- The increase in computing power has directly contributed to revenue growth, with the potential for faster user adoption and commercialization had more resources been available earlier [3][6]

User Engagement and Business Strategy
- OpenAI has reported record highs in daily and weekly active users, attributing this growth to ChatGPT's transition from a consumer-level product to a foundational tool integrated into personal and professional workflows across sectors including education, writing, software development, marketing, and finance [3][6]
- The company has evolved its business strategy from an initial consumer subscription model to include team and enterprise versions, as well as usage-based pricing through its API platform, aligning costs with actual workloads [3][6]

Future Revenue Models
- Looking ahead to 2026, OpenAI plans to focus its financial efforts on quantifiable real-world applications, particularly in healthcare, research, and enterprise scenarios [4][7]
- The company is exploring revenue models beyond subscriptions and APIs, including licensing, intellectual-property agreements, and outcome-based pricing, to support further applications of AI in drug development, energy systems, and financial modeling [4][7]
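The headline figure, roughly $2 billion in 2023 growing to a projected $20 billion by 2025, implies a compound annual growth rate of about 216%. A minimal sketch of that arithmetic:

```python
def cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate implied by start/end values."""
    return (end / start) ** (1 / years) - 1

# ~$2B (2023) to ~$20B (2025 projection), per the article's figures.
growth = cagr(2e9, 20e9, 2)
print(f"{growth:.0%}")  # tenfold over two years ≈ 216% per year
```

Tenfold growth over two years means each year multiplies revenue by √10 ≈ 3.16, which is where the roughly 216% annual rate comes from.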