RISC-V, a Revolutionary NPU
半导体芯闻 · 2025-05-27 10:21
Core Viewpoint
- Semidynamics has introduced Cervell™, a fully programmable Neural Processing Unit (NPU) designed for scalable AI computing from the edge to the data center, marking a fundamental shift in AI processor design and deployment [1][3]

Group 1: Cervell Architecture
- Cervell represents the culmination of Semidynamics' evolution from modular IP components to a tightly integrated, unified architecture rooted in the open RISC-V ecosystem [1]
- The architecture allows data to flow seamlessly between control logic, vector processing, and matrix operations without DMA transfers or synchronization barriers, improving performance and efficiency [6][8]
- Cervell delivers up to 256 TOPS in its maximum configuration, achieving data-center-level inference performance while remaining flexible enough for low-power edge deployments [6]

Group 2: Integration and Flexibility
- Cervell integrates the CPU, vector units, and tensor engines into a single processing entity, eliminating the need for an external host CPU and removing the performance bottlenecks of traditional architectures [6][9]
- The design challenges the traditional NPU model, which typically relies on closed, fixed-function pipelines, by allowing enterprises to tailor the architecture to their own algorithms [8]
- The open RISC-V instruction set architecture (ISA) enables deep customization and compatibility with open software ecosystems, allowing Cervell's capabilities to evolve alongside customer needs [8][9]

Group 3: Market Demand and Trends
- As AI workloads grow in size and complexity, demand is rising for a more unified computing platform, moving away from fragmented architectures that suffer from memory bottlenecks and data-transfer delays [2]
- The shift toward programmable, flexible solutions is driven by customer preferences, as fixed-function NPUs can no longer keep pace with evolving requirements [3]
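The efficiency argument above, that removing DMA transfers and synchronization barriers between compute units cuts overhead, can be illustrated with a toy model. The sketch below is purely conceptual and does not reflect Cervell's actual programming interface or any real driver API: it contrasts a hypothetical discrete-accelerator flow, where every hand-off between units requires a buffer copy, with a unified flow where control, vector, and matrix stages operate on a single shared buffer.

```python
# Toy model of the two designs; all function and variable names are hypothetical.

def discrete_npu(data):
    """Traditional flow: each unit has private memory, so every
    hand-off needs a copy (modeling a DMA transfer + sync barrier)."""
    copies = 0
    vec_mem = list(data)                 # DMA: host -> vector unit
    copies += 1
    vec_out = [x * 2 for x in vec_mem]   # vector stage (toy op)
    mat_mem = list(vec_out)              # DMA: vector -> matrix unit
    copies += 1
    mat_out = [x + 1 for x in mat_mem]   # matrix stage (toy op)
    result = list(mat_out)               # DMA: matrix -> host
    copies += 1
    return result, copies

def unified_npu(data):
    """Unified flow: control, vector, and matrix logic share one
    address space, so each stage reads and writes the same buffer."""
    buf = data                           # shared buffer, no copies
    buf = [x * 2 for x in buf]           # vector stage
    buf = [x + 1 for x in buf]           # matrix stage
    return buf, 0

dres, dcopies = discrete_npu([1, 2, 3])
ures, ucopies = unified_npu([1, 2, 3])
assert dres == ures == [3, 5, 7]         # identical results
print(dcopies, ucopies)                  # 3 copies vs. 0
```

Both paths compute the same result; the difference is entirely in data movement, which is the overhead the unified design claims to eliminate.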