Interpreting the Big Opportunity in AI Storage-Compute Acceleration Systems
2026-03-26 13:20
Interpreting the Big Opportunity in AI Storage-Compute Acceleration Systems (2026-03-24)

Summary
- AI storage has become a compute bottleneck: the share of storage in intelligent-computing-center investment has risen from 1% to 10%-15%, and the China market CAGR exceeds 40%.
- Fengxing Zhiyuan (风行智远) positions itself as "generation 2.5" intelligent storage; its storage-compute direct-path technology bypasses the CPU, raising data throughput 2-4x and cutting energy consumption 30%-40%.
- At the hardware level the company promotes "storage in place of compute" (以存代算), using intelligent drives to handle intermediate data such as the KV cache; memory-access cost can fall to 1/50 of the original solution.
- For MoE models such as DeepSeek, the solution can cut total inference cost by about 30%, markedly reducing dependence on expensive HBM and high-performance CPUs.
- In training, intelligent storage supports incremental updates after a failure, avoiding frequent full-checkpoint writes and saving about 11% of total training cost.
- Benchmarked against VAST Data (valued at USD 30 billion overseas), the company has won the three major telecom operators and leading domestic GPU customers, targeting both retrofits of existing centers and new builds.

Q&A
- In the current era of large AI models, what bottlenecks does the storage system face, and what market opportunity do they create?
- Please introduce Fengxing Zhiyuan's positioning, core product portfolio, and team background.
- Fengxing Zhiyuan aims to become the leader in domestic AI compute and accelerated applications. In the large-model era, the storage system has become a key bottleneck; the company ...
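The incremental-checkpoint idea in the training bullet above can be illustrated with a minimal sketch. This is a hypothetical toy, not the company's implementation: it fingerprints each parameter shard and, at each save point, writes only the shards that changed since the last checkpoint instead of dumping the full model state.

```python
import hashlib
import json

def shard_digest(shard):
    # Cheap content fingerprint for a parameter shard (here a plain list).
    return hashlib.sha256(json.dumps(shard).encode()).hexdigest()

def incremental_checkpoint(state, prev_digests, store):
    """Persist only the shards whose contents changed since the last save.

    state        : dict name -> shard (list of floats)
    prev_digests : dict name -> digest recorded at the previous checkpoint
    store        : dict standing in for the storage backend
    Returns the list of shard names actually written.
    """
    written = []
    for name, shard in state.items():
        d = shard_digest(shard)
        if prev_digests.get(name) != d:
            store[name] = list(shard)  # write only the dirty shard
            prev_digests[name] = d
            written.append(name)
    return written
```

A first call writes every shard; if only one layer's weights change before the next save, only that shard is rewritten, which is the source of the claimed reduction in checkpoint traffic.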
AI Storage Solutions Giant Files for IPO: RMB 4.58 Billion Valuation, Tencent Backing, Based in Beijing
Gelonghui APP (格隆汇APP) · 2026-01-29 10:08
Core Viewpoint
- The article discusses the IPO of a leading AI storage solutions company, valued at RMB 4.58 billion, with Tencent making a significant investment [1]

Group 1
- The company is preparing for its IPO, indicating strong market interest and growth potential in the AI sector [1]
- The RMB 4.58 billion valuation reflects its position as a major player in the AI storage market, highlighting investor confidence [1]
- Tencent's investment signals strategic backing from a major tech player, which may enhance the company's market credibility and operational capabilities [1]
Starry Sky (星辰天合) Files for IPO, Focused on AI Storage Solutions, with Losses in Two Consecutive Years
Ge Long Hui · 2026-01-29 10:00
Group 1: Market Overview
- The recent price increase of storage chips has become a focal point in the global market, with Samsung Electronics raising NAND flash supply prices by over 100% in Q1 2026, followed by SK Hynix and others [1]
- Several domestic companies in the storage chip sector, such as Changqi Technology and Zhaoyi Innovation, have reported positive earnings forecasts for 2025, with Changqi Technology expecting a net profit increase of 52% to 66% year-on-year [1]

Group 2: Company Profile
- Beijing Starry Sky Technology Co., Ltd. (Starry Sky) was founded in May 2015 and has completed eight rounds of financing, with notable investors including Tencent and Northern Light Venture Capital [3][4]
- The company focuses on enterprise-level AI storage, providing AI data lake storage and AI training and inference storage solutions [6][8]

Group 3: Financial Performance
- Revenue has grown steadily: RMB 166.84 million in 2023, RMB 172.48 million in 2024, and RMB 194.86 million in the first nine months of 2025; over the same span the company swung from a net loss of RMB 180.67 million in 2023 to a net profit of RMB 0.81 million in the first nine months of 2025 [11][12]
- Gross margin improved from 55.4% in 2023 to 63.7% in 2024 and held at 63.7% in the first nine months of 2025 [11]

Group 4: Market Position and Competition
- Starry Sky holds approximately 10.4% market share among distributed AI storage solution providers, making it the second-largest in China and the largest independent provider [22][31]
- The AI storage infrastructure market is expected to grow significantly: on-premises deployment storage capacity is projected to grow from 13.4 EB in 2024 to 67.2 EB by 2030, a compound annual growth rate of 30.9% [23][26]

Group 5: Product Offerings
- Starry Sky's AI storage solutions are delivered in two forms, integrated appliances and pure software, tailored to customer needs [9]
- The proportion of revenue from AI data lake storage solutions has risen from 37.2% in 2023 to 46.1% in 2025 [13][14]
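As a quick sanity check on the capacity forecast quoted above, the compound annual growth rate implied by growth from 13.4 EB in 2024 to 67.2 EB in 2030 can be computed directly. The figures come from the summary; the arithmetic below is ours:

```python
# CAGR implied by the quoted capacity forecast:
# 13.4 EB (2024) -> 67.2 EB (2030), i.e. 6 years of growth.
start_eb, end_eb, years = 13.4, 67.2, 2030 - 2024
cagr = (end_eb / start_eb) ** (1 / years) - 1
# about 0.31, consistent with the quoted ~30.9% CAGR
```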
[Research Report | Industry] The Hundred-Billion-Yuan Liquid Cooling Market Is Poised for Takeoff as Domestic Supply Chains Accelerate Entry: Who Will Capture the New Dividends of the NVIDIA Ecosystem? Watch These Full-Chain Players
Yicai (第一财经) · 2025-12-08 11:47
Group 1
- The core viewpoint of the article emphasizes the importance of timely and relevant research reports in identifying investment opportunities, particularly in emerging markets such as liquid cooling and AI-driven storage [1]
- The liquid cooling market is projected to reach the hundred-billion-yuan scale, with domestic companies accelerating their entry, highlighting the competitive landscape and potential beneficiaries within the NVIDIA ecosystem [1]
- AI is driving a new storage cycle, with HBM (High Bandwidth Memory) expected to grow fivefold over six years, indicating a significant growth opportunity in the equipment sector for key players [1]
GF Securities: Inference Drives Rapid Growth in AI Storage; Watch Core Beneficiaries Along the Supply Chain
Zhitong Finance (智通财经网) · 2025-09-23 08:56
Core Insights
- The rapid growth of AI inference applications is significantly increasing reliance on high-performance memory and tiered storage, with HBM, DRAM, SSD, and HDD each playing critical roles in long-context and multimodal inference scenarios [1][2][3]
- Overall storage demand is expected to surge to hundreds of exabytes (EB) as lightweight model deployment drives capacity needs [1][3]

Group 1: Storage in AI Servers
- Storage in AI servers consists primarily of HBM, DRAM, and SSD; moving down the hierarchy, performance decreases while capacity increases and per-unit cost falls [1]
- Frequently accessed or mutable data is kept in the higher tiers (CPU/GPU caches, HBM, DRAM), while infrequently accessed or long-lived data is moved to lower tiers such as SSD and HDD [1]

Group 2: Tiered Storage for Efficient Computing
- HBM is integrated within GPUs to provide high-bandwidth temporary buffering for weights and activation values, supporting parallel computing and low-latency inference [2]
- DRAM serves as system memory, holding intermediate data, batch-processing queues, and model I/O, and enabling efficient data transfer between CPU and GPU [2]
- Local SSDs handle real-time loading of model parameters and data, meeting high-frequency read/write needs, while HDDs offer economical bulk capacity for raw data and historical checkpoints [2]

Group 3: Growth Driven by Inference Needs
- Memory benefits from long-context and multimodal inference demands: high-bandwidth, large-capacity memory reduces access latency and improves parallel efficiency [3]
- For example, the Mooncake project achieved leaps in computational efficiency through resource reconstruction, and various hardware upgrades support high-performance inference for complex models [3]
- Under key assumptions, the storage capacity required by ten Google-scale inference applications in 2026 is estimated at 49 EB [3]
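The tiering logic described above can be sketched as a toy placement policy. The thresholds below are illustrative assumptions of ours, not figures from the report; real storage systems use far richer heuristics (recency, bandwidth targets, QoS), but the shape of the decision is the same: hot, small data lands in fast tiers, while cold, bulky data sinks to cheap ones.

```python
# Toy placement policy for the HBM / DRAM / SSD / HDD hierarchy.
# Thresholds are illustrative, chosen only to show the tiering idea.

def place(accesses_per_s, size_gb):
    """Choose a storage tier for a data object (thresholds are illustrative)."""
    if accesses_per_s >= 1000 and size_gb <= 64:
        return "HBM"   # weights/activations buffered next to the GPU
    if accesses_per_s >= 10:
        return "DRAM"  # intermediate data, batch queues, model I/O
    if accesses_per_s >= 0.1:
        return "SSD"   # parameters and data loaded on demand
    return "HDD"       # raw datasets and historical checkpoints
```

With these assumed thresholds, `place(5000, 32)` lands in HBM, while a rarely touched checkpoint such as `place(0.001, 800)` sinks to HDD.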