The Computing Power Wave
Yang Yuanqing: The Next Wave of Computing Power Will Come from an Explosion in AI Inference | Live from CES
Xin Lang Cai Jing · 2026-01-07 02:35
Sina Tech News, January 7, 2026 (Beijing time) – Yang Yuanqing, Chairman and CEO of Lenovo Group, said in the keynote session at CES that the next wave of computing power will come from an explosion in AI inference.

In Yang Yuanqing's view, the global computing-infrastructure market has been shaped by four waves of innovation: the first was enterprise informatization and digital transformation built on traditional computing; the second was driven by cloud services and cloud applications, propelling the rapid rise of cloud computing; the third was the large-scale compute clusters spawned by the training of large language models, a stage in which AI training was concentrated mainly in the cloud. "Today, we are ...

In fact, there is already broad consensus across the global AI industry that the focus of competition over future computing infrastructure is shifting from training to inference. To address the memory, latency, security, and energy-consumption challenges currently facing AI inference, Lenovo has released what it describes as the industry's most comprehensive portfolio of inference-optimized servers, including the AI inference servers SR675i and SR650i and the edge computing server SE455i. The portfolio is intended to bring AI models on-premises and to the edge, closer to where data originates, substantially improving inference efficiency, lowering operating costs, and strengthening data security to meet enterprises' diverse, real-time AI deployment needs.

Sina disclaimer: All conference transcripts are compiled from on-site stenography and have not been reviewed by the speaker. Sina publishes this article to convey more information; doing so does not imply endorsement of its views or confirmation of its account.
Envicool (英维克, 002837): Liquid Cooling Drives Strong H1 2025 Results; Computing-Power-Era Opportunities Forge a Growth Engine
Guotou Securities · 2025-08-21 00:59
Investment Rating
- The report maintains a "Buy-A" investment rating for the company, with a target price of 81.88 CNY per share [4][7].

Core Views
- The company achieved significant revenue growth in H1 2025, with total revenue reaching 2.573 billion CNY, a year-on-year increase of 50.25%. Net profit attributable to shareholders was 216 million CNY, up 17.54% year-on-year [1].
- Revenue growth was driven primarily by rising demand for cooling solutions in data centers, with liquid cooling revenue exceeding 200 million CNY in H1 2025 [2][3].
- The company has built a comprehensive liquid cooling product portfolio and is positioned to benefit from growing demand for computing power, particularly through partnerships with major clients such as Nvidia, ByteDance, Tencent, and Alibaba [3].

Financial Summary
- The company forecasts revenues of 6.104 billion CNY, 7.910 billion CNY, and 9.921 billion CNY for 2025, 2026, and 2027, respectively, with net profits of 611 million CNY, 781 million CNY, and 1.019 billion CNY for the same years [4][11].
- The overall gross margin for H1 2025 was 26.15%, down 4.84 percentage points year-on-year, attributed to the regional sales mix and increased competition in the Chinese market [2].
- The net profit margin for H1 2025 was 8.78%, down 1.92 percentage points year-on-year [2].