In the Edge AI Era, Storage Is Changing: 江波龙 Goes All Out
半导体行业观察 · 2026-03-31 02:23
Core Viewpoint
The article emphasizes that artificial intelligence (AI) is becoming a reality, with significant investments in infrastructure and models, and anticipates that 2026 will mark a year of large-scale AI deployment, particularly in edge applications [1][2].

Group 1: AI Storage Needs
- In AI training, the core task of storage systems is handling massive data throughput and high-frequency checkpoint writes to prevent I/O bottlenecks, which has driven demand for high-bandwidth memory (HBM) and large-capacity SSDs [3].
- In edge AI applications, the focus shifts to inference, where close integration with application scenarios requires innovation in power consumption, performance, and size [3][4].
- Efficiency is essential in both training and inference; the article calls for a layered approach to storage processing to address the high costs and token expenses of edge AI [3][4].

Group 2: Customized Storage Solutions
- Edge AI requires deeply integrated, customized storage solutions rather than generic products, focusing on high-performance capacity and system-level integration [5][7].
- The company has developed comprehensive capabilities across the entire supply chain, including chip design and manufacturing, to provide tailored storage services for edge AI applications [7][8].

Group 3: Product Innovations
- The company showcased its new PCIe Gen5 mSSD, designed for edge AI devices, combining a compact footprint with high performance: read/write speeds reach up to 11 GB/s and 10 GB/s, respectively [9][10].
- The mSSD's innovative cooling solution allows for sustained high performance, significantly improving thermal management compared to conventional SSDs [13][14].

Group 4: Intelligent Storage Solutions
- The Storage Processing Unit (SPU) and the Intelligence Storage Agent (iSA) create a synergistic hardware-software ecosystem for edge AI storage, enhancing storage-scheduling efficiency [16][19].
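As a rough back-of-the-envelope illustration (not a vendor benchmark), the quoted 11 GB/s peak read rate bounds how quickly an edge device could stream model weights off the drive; the model sizes below are hypothetical examples, and real-world throughput also depends on queue depth, file layout, and thermals:

```python
# Model-load times at the article's quoted peak sequential rates
# (11 GB/s read, 10 GB/s write). Purely illustrative arithmetic.
READ_GBPS = 11.0
WRITE_GBPS = 10.0

def load_time_s(model_gb: float, gbps: float = READ_GBPS) -> float:
    """Seconds to stream `model_gb` gigabytes at `gbps` GB/s."""
    return model_gb / gbps

for size_gb in (4, 8, 14):  # hypothetical quantized-model sizes
    print(f"{size_gb} GB model: ~{load_time_s(size_gb):.2f} s to read")
```

Even a 14 GB model would load in under two seconds at the quoted peak rate, which is the kind of cold-start latency that matters for on-device inference.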
- The SPU, designed specifically for AI applications, balances capacity and cost, offering significant advantages over traditional storage solutions [16][19].

Group 5: Advanced Caching Technology
- High Level Cache (HLC) technology integrates with the SPU to optimize performance and cost in edge AI devices, allowing for efficient data management and reduced DRAM requirements [21][22].
- HLC has demonstrated significant performance improvements in real-world applications, achieving response times comparable to higher-capacity configurations [22].

Group 6: System in Package (SiP) Technology
- The company has developed a complete SiP design process, enabling the integration of multiple chips into a single package, which is crucial for compact edge AI devices [25][26].
- This technology not only reduces hardware size but also enhances thermal management and structural layout, making it a competitive solution for various edge AI applications [26].

Conclusion
The advancements in storage technology and the strategic focus on edge AI applications position the company as a leader in the evolving AI landscape, underscoring the importance of innovation and collaboration in driving industry growth [28].
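The HLC idea mentioned in Group 5, a cache layer that reduces DRAM requirements by keeping only hot data in fast memory while cold data spills to flash, can be sketched as a two-tier LRU cache. This is an illustrative toy under stated assumptions; the `TieredCache` name and the eviction/promotion policy are inventions for this sketch, not 江波龙's actual HLC design:

```python
from collections import OrderedDict

class TieredCache:
    """Toy two-tier cache: a small DRAM-like tier that evicts
    least-recently-used entries into a larger flash-like tier."""

    def __init__(self, dram_slots: int):
        self.dram = OrderedDict()  # fast tier, bounded, LRU-ordered
        self.flash = {}            # slow tier, "large" (unbounded here)
        self.dram_slots = dram_slots

    def put(self, key, value):
        self.dram[key] = value
        self.dram.move_to_end(key)
        while len(self.dram) > self.dram_slots:
            # Spill the coldest entry to the flash tier.
            old_key, old_val = self.dram.popitem(last=False)
            self.flash[old_key] = old_val

    def get(self, key):
        if key in self.dram:            # DRAM hit: cheap
            self.dram.move_to_end(key)
            return self.dram[key]
        if key in self.flash:           # flash hit: promote back to DRAM
            return_value = self.flash.pop(key)
            self.put(key, return_value)
            return return_value
        return None                     # miss
```

With two DRAM slots, inserting a third key spills the coldest one to flash, and reading it back promotes it again, so the small fast tier only ever holds the working set. That is the cost argument: DRAM is sized for hot data, not total data.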