Core Viewpoint
- The article examines the challenges traditional IP providers face in the NPU (Neural Processing Unit) market, arguing that their reliance on legacy architectures limits their ability to innovate and adapt to new AI workloads [1][5][6].

Group 1: NPU Market Dynamics
- The NPU market is evolving rapidly, with both established and emerging companies competing to offer integrated solutions that combine matrix computation with general-purpose computing [1].
- Many leading IP companies have adopted similar strategies: slightly modifying their traditional instruction sets and attaching matrix accelerators tuned for common machine-learning benchmarks [2][4].

Group 2: Limitations of Current Architectures
- Current architectures require algorithms to be partitioned across two engines, which works for a limited set of models but struggles with newer ones such as Transformers, which demand a far broader range of graph operators [4][6]; a minimal sketch of this partitioning problem appears at the end of this summary.
- Reliance on fixed-function accelerators has led to obsolescence as new AI models emerge, forcing companies to reconsider their approach to NPU design [5][6].

Group 3: Strategic Missteps
- Traditional IP companies opted for the short-term fix of bolting matrix accelerators onto existing processors, a choice that has become a technological trap as demands for flexibility and performance grow [5][6].
- This illustrates the "innovator's dilemma": companies must weigh the need for new architectures against the risk of undermining their existing successful products [6].
Traditional NPU suppliers have hit a wall!
半导体行业观察·2025-06-12 00:42
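
To make the Group 2 point concrete, below is a minimal Python sketch of the dual-engine partitioning problem, under assumptions of my own: the operator names, the `ACCELERATOR_OPS` coverage set, and the `partition` / `fallback_ratio` helpers are hypothetical illustrations, not any vendor's actual operator list or scheduler. The idea is simply that operators a fixed-function matrix accelerator cannot run fall back to the general-purpose core, and Transformer-style graphs trigger far more of these fallbacks than CNN-era graphs.

```python
# Hypothetical illustration of dual-engine operator partitioning.
# Operator names and the supported-op set are assumptions for this sketch,
# not a real NPU's coverage list.

# Graph operators a fixed-function matrix accelerator typically handles well.
ACCELERATOR_OPS = {"conv2d", "matmul", "relu", "maxpool", "batchnorm"}


def partition(graph: list[str]) -> dict[str, list[str]]:
    """Split an operator list between the matrix accelerator and the
    general-purpose core: anything the accelerator cannot run falls back."""
    placement = {"accelerator": [], "cpu_fallback": []}
    for op in graph:
        target = "accelerator" if op in ACCELERATOR_OPS else "cpu_fallback"
        placement[target].append(op)
    return placement


def fallback_ratio(graph: list[str]) -> float:
    """Fraction of operators forced onto the slower general-purpose core."""
    placement = partition(graph)
    return len(placement["cpu_fallback"]) / len(graph)


# A CNN-era graph maps almost entirely onto the accelerator ...
cnn_graph = ["conv2d", "batchnorm", "relu", "maxpool", "conv2d", "relu", "matmul"]

# ... while a Transformer-style graph needs operators (softmax, layernorm, GELU,
# attention reshapes) the fixed-function block was never designed for.
transformer_graph = ["matmul", "softmax", "layernorm", "gelu",
                     "transpose", "matmul", "embedding_lookup", "layernorm"]

if __name__ == "__main__":
    print(f"CNN fallback ratio:         {fallback_ratio(cnn_graph):.0%}")
    print(f"Transformer fallback ratio: {fallback_ratio(transformer_graph):.0%}")
```

Running the sketch shows the CNN graph incurring no fallbacks while most of the Transformer graph lands on the general-purpose core, which is the flexibility gap the article attributes to matrix-accelerator-plus-legacy-CPU designs.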