Brain-like Computing (类脑计算)
Low Power, Highly Parallel, and Highly Efficient Like the Brain! World's First of Its Kind Successfully Developed
Guan Cha Zhe Wang· 2025-08-02 13:18
Core Insights
- The article discusses the launch of the Darwin Monkey ("悟空"), a new-generation neuromorphic brain-like computer developed by Zhejiang University, which features over 2 billion neurons, marking a significant advancement in the field of neuromorphic computing [3][4].

Group 1: Technology and Development
- The Darwin Monkey consists of 15 blade-type neuromorphic servers, each integrating 64 Darwin 3rd generation brain-like computing chips, which were developed in collaboration with Zhejiang Lab [4][7].
- The Darwin 3rd generation chip supports over 2.35 million spiking neurons and hundreds of millions of synapses, enabling advanced brain-like computing capabilities [4][7].
- The system operates at approximately 2000 watts under typical conditions, showcasing its low power consumption [7].

Group 2: Innovations and Breakthroughs
- The research team achieved breakthroughs in several key technologies, including large-scale neuron system interconnection and integration architecture, adaptive time-step control methods, and a layered system resource management framework [10][15].
- A new-generation Darwin brain-like operating system was developed, which employs a hierarchical resource management architecture to optimize system resources dynamically [10][15].

Group 3: Applications and Implications
- The Darwin Monkey has successfully deployed various intelligent applications, including running the DeepSeek brain-like model for logical reasoning, content generation, and mathematical problem-solving [13][15].
- The system serves as a natural platform for brain simulation, aiding neuroscience research by providing new experimental tools to explore brain mechanisms [15].
- The capabilities of the Darwin Monkey are expected to address the high energy consumption and computational demands of current deep learning models, potentially revolutionizing artificial intelligence [15].
Breakthrough Progress! World's First of Its Kind Successfully Developed
Huan Qiu Wang Zi Xun· 2025-08-02 12:11
Core Insights
- The Zhejiang University Brain-Machine Intelligence National Key Laboratory has launched a new-generation neuromorphic brain-like computer named "Wukong", which features over 2 billion spiking neurons and more than 100 billion synapses, closely approaching the scale of a macaque brain [1][3].
- "Wukong" operates at a power consumption of approximately 2000 watts under typical conditions, making it the first neuromorphic computer internationally, based on dedicated neuromorphic chips, to exceed 2 billion neurons [1][3].

Group 1
- "Wukong" is equipped with 960 Darwin 3rd generation neuromorphic computing chips, organized into 15 blade-style neuromorphic servers [3][5].
- Each Darwin chip supports over 2.35 million spiking neurons and hundreds of millions of synapses, along with a dedicated instruction set for brain-like computing and an online learning mechanism [3][5].

Group 2
- The research team has developed a new-generation Darwin brain-like operating system and has validated multiple intelligent applications on "Wukong", which can run the DeepSeek brain-like model for logical reasoning, content generation, and mathematical problem-solving [5].
- "Wukong" can also simulate the brain structures of various animals, including Caenorhabditis elegans, zebrafish, mice, and macaques, showcasing its versatility in brain-like computation [5].
Breakthrough Progress in China's Neuromorphic Brain-like Computing
news flash· 2025-08-02 12:05
Core Insights
- Zhejiang University Brain-Machine Intelligence National Key Laboratory has launched a new-generation neuromorphic brain-like computer named "Darwin Monkey", or "Wukong" [1].
- The system supports over 2 billion spiking neurons and more than 100 billion synapses, approaching the scale of a macaque brain [1].
- Under typical operating conditions, power consumption is approximately 2000 watts, making it the first neuromorphic computer based on dedicated neuromorphic chips whose neuron scale exceeds 2 billion [1].
- Neuromorphic computing aims to replicate the efficient computational mechanisms of biological neural networks to create low-power, highly parallel, highly efficient intelligent computing systems [1].
Zhejiang University Releases Brain-like Computer "Wukong"
Zhong Guo Xin Wen Wang· 2025-08-02 10:27
Core Insights
- Zhejiang University Brain-Machine Intelligence National Key Laboratory has launched a new-generation neuromorphic brain-like computer named Darwin Monkey (referred to as "Wukong") [1].
- The neuromorphic computer supports over 2 billion spiking neurons, placing it among the top systems in global neuromorphic computing [1].

Technology Overview
- Neuromorphic computing mimics the working mechanism of biological neural networks to create intelligent computing systems that are low-power, highly parallel, and efficient (a minimal spiking-neuron sketch follows this entry) [1].
- Wukong consists of 15 blade-type neuromorphic servers, each integrating 64 Darwin 3rd generation brain-like computing chips [1].
- Each chip can support over 2.35 million spiking neurons and hundreds of millions of synapses, along with a dedicated instruction set for brain-like computing and online learning mechanisms [1].

Performance Metrics
- Wukong's synapse count exceeds 100 billion, and its neuron count is close to that of a macaque brain, with a typical operating power consumption of approximately 2000 watts [1].
- The system has successfully run the DeepSeek brain-like large model, completing tasks such as logical reasoning and content generation, and can preliminarily simulate the brains of various animals from C. elegans to macaques [1].
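The entries above repeatedly mention "spiking neurons" and online learning. As a point of reference, the sketch below implements a generic leaky integrate-and-fire (LIF) neuron in Python, the textbook model commonly used in spiking-neural-network software. It is only a minimal illustration of what a spiking neuron computes; it makes no claim about the neuron model, instruction set, or learning rule actually implemented on the Darwin 3 chips, and all parameter values are arbitrary.

```python
# Generic leaky integrate-and-fire (LIF) neuron, for illustration only.
# This is NOT the Darwin 3 chip's neuron model; parameters are arbitrary.
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_reset=0.0, v_threshold=1.0):
    """Integrate a current trace through one LIF neuron.

    Returns the membrane-potential trace and the spike times (in ms)."""
    v = v_rest
    voltages, spike_times = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: decay toward rest while accumulating input.
        v += dt / tau * (-(v - v_rest) + i_in)
        if v >= v_threshold:            # threshold crossing -> emit a spike
            spike_times.append(step * dt)
            v = v_reset                 # reset membrane after the spike
        voltages.append(v)
    return np.array(voltages), spike_times

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Noisy drive strong enough to push the membrane past threshold.
    current = 1.2 + rng.normal(0.0, 0.3, size=500)
    v_trace, spikes = simulate_lif(current)
    print(f"{len(spikes)} spikes over {len(current)} ms of simulated input")
```

Information in such a model is carried by the timing of the spikes rather than by continuous activations, which is what allows neuromorphic hardware to stay idle, and hence low-power, between events.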
Chinese Scholars Publish Four Cell Papers in a Row, Featured on the Cover of Cell
生物世界· 2025-07-11 08:40
Core Insights
- A significant collaborative research effort involving over 300 scientists from more than 30 institutions has published 10 papers on brain mapping in top-tier journals such as Cell and its sub-journals [2][3].

Group 1: Research Findings
- The studies reveal various cell types and their connections in the brains of mice and primates, with a notable cover image depicting a macaque gazing at a starry universe, symbolizing the brain's complexity [5].
- The first multimodal atlas of the macaque claustrum has been created, identifying 48 cell types through single-nucleus RNA sequencing and highlighting the cell types unique to macaques compared with other species [6][7].
- The research integrates single-cell transcriptomics, spatial data, and connectivity analysis, providing a comprehensive understanding of the claustrum's role as an "information hub" in the brain [8].

Group 2: Methodological Innovations
- A new high-speed imaging technique has been developed for whole-mouse peripheral nerves at subcellular resolution, allowing for unprecedented 3D mapping of the peripheral nervous system [16][20].
- This technique enables the visualization of sensory and motor projections, revealing intricate structural features and pathways of the vagus nerve [20][21].

Group 3: Implications for Future Research
- The identification of cell type-specific enhancers in the macaque brain offers new tools for monitoring and manipulating neuronal activity, advancing the understanding of primate brain structure and cognitive principles [24][28].
- The advancements in brain mapping technologies are expected to facilitate research into brain diseases and inspire developments in artificial intelligence systems [30].
Shanghai Overseas Friendship Association Visits Yangpu to Help Build a Hub of Technological Innovation
Zhong Guo Xin Wen Wang· 2025-07-02 14:05
Core Viewpoint
- The Shanghai Overseas Friendship Association organized an event in Yangpu to promote technological innovation and development opportunities, emphasizing collaboration with overseas Chinese and high-end talent introduction [1].

Group 1: Event Overview
- The event, titled "Gathering in Shanghai for Development", involved over 60 participants, including members of the Shanghai Overseas Friendship Association and representatives from Hong Kong business associations [1].
- The event included visits to online new economy enterprises and technology companies in Yangpu, as well as tours of the World Skills Museum and the "Fudan Source" Technology Achievement Museum [1].

Group 2: Government and Organizational Insights
- Yangpu District officials highlighted the area's strengths using the terms "University, Big Factory, Big Material" and discussed the district's technological innovation through the "three clusters" approach [2].
- The delegation provided constructive feedback on various topics, including the low-altitude economy, foreign publicity, community roles, talent policies, and enterprise interactions [2].

Group 3: Company Visits and Innovations
- At the Ford China R&D Center, the delegation learned about Ford's development history, global design layout, and successful collaborations with domestic universities [3].
- The Meituan (Shanghai) Command Center showcased the application of big data, artificial intelligence, and cloud computing in its operations through interactive demonstrations [3].
- New Helium's chief scientist presented the advantages of brain-like computing architecture and the current state of industry implementation, highlighting the company's role in Shanghai's brain chip development [6].

Group 4: Educational and Technological Exhibitions
- The World Skills Museum provided insights into the history of skill development and the importance of technology in human progress through its interactive exhibits [8].
- The "Fudan Source" Technology Achievement Museum featured three thematic exhibition areas, showcasing Fudan University's significant technological innovations and the spirit of scientific pursuit [9].
Embodied Intelligence Drives the Realization of General Artificial Intelligence
Group 1
- The core idea of embodied intelligence emphasizes that cognition is shaped by the agent's perception and actions, suggesting that intelligence arises from the interaction between the agent's body and the surrounding environment rather than solely from brain function [1][2].
- Embodied intelligence theory has profound implications across fields such as cognitive science, psychology, anthropology, and art, leading to the emergence of sub-disciplines like embodied cognition and embodied psychology [1][2].
- The transition from traditional disembodied intelligence to modern embodied intelligence marks a significant shift in artificial intelligence research, where the latter integrates physical interaction with the environment for learning and decision-making [2][3].

Group 2
- The history of artificial intelligence has evolved through three stages: the first generation focused on knowledge-based reasoning models, the second generation introduced data-driven models, and the third generation, marked by the emergence of large language models, represents a new phase of development [3][4].
- The introduction of large language models in 2020 has enabled machines to interact freely with humans in open domains, indicating a significant step towards general artificial intelligence [4][5].
- Despite advancements in language generation, limitations remain in achieving domain generality across tasks, particularly in complex areas like medical diagnosis, highlighting the need for embodied intelligence to bridge these gaps [5][6].

Group 3
- The concept of embodied intelligence was first proposed in the field of robotics, emphasizing the importance of the interaction between the body and the environment in intelligent behavior [6][7].
- Embodied intelligence has driven advances in robotics technology, shifting from single-modal to multi-modal perception, which is crucial for applications like autonomous vehicles [8][9].
- The integration of the agent concept in embodied intelligence allows robots to combine thinking, perception, and action, facilitating tasks in both the digital and physical worlds and enhancing the efficiency of robotic development through simulation [9].
Optical Chips Are About to Take Off!
半导体行业观察· 2025-06-09 00:53
Core Viewpoint
- The rapid development of large language models (LLMs) is pushing the limits of contemporary computing hardware, necessitating the exploration of alternative computing paradigms such as photonic hardware to meet the increasing computational demands of AI models [1][4].

Group 1: Photonic Hardware and Its Advantages
- Photonic computing utilizes light for information processing, offering high bandwidth, strong parallelism, and low thermal dissipation, which are essential for next-generation AI applications [4][5].
- Recent advancements in photonic integrated circuits (PICs) enable the construction of fundamental neural network modules, such as coherent interferometer arrays and micro-ring resonator weight arrays, facilitating dense matrix multiplication and addition operations [4][5].
- The integration of two-dimensional materials like graphene and transition metal dichalcogenides (TMDCs) into silicon-based photonic platforms enhances the functionality of modulators and on-chip synaptic elements [5][31].

Group 2: Challenges in Mapping LLMs to New Hardware
- Mapping transformer-based LLM architectures onto new photonic hardware presents challenges, particularly in designing reconfigurable circuits for dynamic weight matrices that depend on input data [5][6].
- Achieving nonlinear functions and normalization in photonic or spintronic media remains a significant technical hurdle [5][6].

Group 3: Key Components and Technologies
- Photonic neural networks (PNNs) leverage various optical devices, such as micro-ring resonators and Mach-Zehnder interferometer arrays, to perform efficient computations (a toy numerical sketch of an interferometer mesh follows this entry) [9][13].
- The use of metasurfaces allows for high-density parallel optical computations by modulating light properties through sub-wavelength structured materials [14][16].
- 4f optical systems enable linear filtering functions through Fourier transformation, integrating deep diffraction neural networks into optical architectures [20][21].

Group 4: Integration of Two-Dimensional Materials
- The integration of graphene and TMDCs into photonic chips is crucial for developing high-speed, energy-efficient AI hardware, with applications in optical modulators, photodetectors, and waveguides [31][35][36].
- Graphene's exceptional optical and electronic properties, combined with TMDCs' tunable bandgap, enhance the performance of photonic devices, making them suitable for AI workloads [31][32].

Group 5: Future Directions and Challenges
- The scalability of integrating two-dimensional materials poses challenges due to their fragility, necessitating advancements in transfer techniques and wafer-scale synthesis [45].
- Material stability and the complexity of integration with existing CMOS processes are critical factors that need to be addressed for the widespread adoption of these technologies [45][46].
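The Mach-Zehnder interferometer (MZI) arrays mentioned above implement linear algebra directly in the optical domain. The toy NumPy sketch below models an idealized, lossless MZI mesh acting on complex field amplitudes to apply a programmable unitary matrix, which is the basic primitive behind photonic matrix-vector multiplication. It is an assumption-based illustration, not a simulation of any specific photonic integrated circuit; the phase conventions and mesh layout are simplified.

```python
# Toy model of a mesh of 2x2 Mach-Zehnder interferometers (MZIs) applying a
# programmable unitary to optical field amplitudes. Idealized and lossless,
# for illustration only.
import numpy as np

def mzi(theta, phi):
    """2x2 transfer matrix of one MZI: external phase, 50:50 coupler,
    internal phase, 50:50 coupler. Unitary by construction."""
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # 50:50 beamsplitter
    ps_ext = np.diag([np.exp(1j * phi), 1])          # external phase shifter
    ps_int = np.diag([np.exp(1j * theta), 1])        # internal phase shifter
    return bs @ ps_int @ bs @ ps_ext

def mesh_unitary(n_modes, phases):
    """Embed successive MZIs on neighbouring mode pairs (a rectangular,
    Clements-style arrangement) and multiply them into one n x n unitary."""
    u = np.eye(n_modes, dtype=complex)
    k = 0
    for layer in range(n_modes):
        for i in range(layer % 2, n_modes - 1, 2):   # alternate even/odd pairs
            block = np.eye(n_modes, dtype=complex)
            block[i:i + 2, i:i + 2] = mzi(*phases[k])
            u = block @ u
            k += 1
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 4
    n_mzis = sum(len(range(l % 2, n - 1, 2)) for l in range(n))  # 6 for n = 4
    phases = rng.uniform(0, 2 * np.pi, size=(n_mzis, 2))
    U = mesh_unitary(n, phases)
    x = rng.normal(size=n) + 1j * rng.normal(size=n)  # input field amplitudes
    y = U @ x                                         # optical matrix-vector product
    print("mesh is unitary:", np.allclose(U.conj().T @ U, np.eye(n)))
    print("power conserved:", np.isclose(np.linalg.norm(x), np.linalg.norm(y)))
```

Because the mesh is unitary, total optical power is conserved, which is why the sketch checks that the output norm matches the input norm; in real devices, weight matrices that are not unitary are typically realized by combining such meshes with amplitude modulation.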
Investment & Financing | Hundred-Billion Financing Lands! Three Key Factors Behind Qianjue Technology's Successful Fundraising
Sou Hu Cai Jing· 2025-06-04 00:33
Core Viewpoint
- Qianjue Technology has successfully completed several rounds of financing, driven by its innovative "embodied brain" system, strong team capabilities, and significant market potential [2][16].

Group 1: Technological Advantages
- The "embodied brain" system developed by Qianjue Technology showcases significant innovation in robotic intelligence, emphasizing multi-modal real-time perception and autonomous execution without reliance on pre-set strategies [3][7].
- In practical tests, the system demonstrated cross-environment adaptability and long-duration autonomous decision-making, and it can interact with over twenty types of embodied hardware [5].
- The adoption of "brain-like computing" technology allows the system to integrate perception, reasoning, and behavior, marking a qualitative leap from passive execution to proactive planning [7].

Group 2: Team Strength
- Qianjue Technology was incubated at Tsinghua University's Brain-like Center and is led by pioneering researchers in brain-like computing, with a technical team primarily composed of master's and doctoral graduates from Tsinghua University [8][10].
- The deep integration of academic research and engineering practice enables the company to accurately grasp technological development directions and efficiently solve technical challenges, ensuring a continuous output of innovative technologies and products [10].

Group 3: Market Potential
- The embodied intelligence market shows significant growth potential, and Qianjue Technology's system has already achieved stable operation in scenarios such as home services, logistics, and commercial operations [11][13].
- The company's autonomous home service robot can operate for several hours until battery depletion, showcasing its stability and practicality [13].
- Collaborations with leading companies in the 3C industry have produced the world's largest real-sampled home-scene dataset, further expanding its market application space [13][16].
Artificial Intelligence Still Is Not a Modern Science, Yet People Eagerly Use Four Practices to Dress It Up
Guan Cha Zhe Wang· 2025-05-21 00:09
Group 1
- The term "artificial intelligence" was formally introduced at a 1956 conference at Dartmouth College, marking the beginning of efforts to replicate human intelligence through modern science and technology [1].
- Alan Turing is recognized as the father of artificial intelligence due to his introduction of the "Turing Test" in 1950, which provides a method to determine whether a machine can exhibit intelligent behavior equivalent to a human's [1][3].
- The Turing Test involves a human evaluator interacting with an isolated "intelligent agent" through a keyboard and display; if the evaluator cannot distinguish between the machine and a human, the machine is considered intelligent [3][5].

Group 2
- The Turing Test is characterized as a subjective evaluation method rather than an objective scientific test, as it relies on human judgment rather than consistent, measurable criteria [6][9].
- Despite claims of machines passing the Turing Test, such as the chatbot Eugene Goostman in 2014, there is no consensus that these machines possess human-like thinking capabilities, highlighting the limitations of the Turing Test as a scientific standard [6][8].
- Turing's original paper contains subjective reasoning and speculative assertions, which, while valuable for exploration, do not meet the rigorous standards of scientific argumentation [8][9].

Group 3
- The field of artificial intelligence has been criticized for lacking a solid scientific foundation, often relying on conjecture and analogy rather than empirical evidence [10][19].
- The emergence of terms like "scaling law" in AI research reflects a trend of using non-scientific concepts to justify claims about machine learning performance, claims which may not hold true under scrutiny (a toy sketch of what such an empirical fit looks like follows this entry) [16][17].
- Historical critiques, such as those from Hubert L. Dreyfus in 1965, emphasize the need for a deeper scientific understanding of AI rather than superficial advances based on speculative ideas [18][19].

Group 4
- The ongoing development of AI as a practical technology has achieved significant progress, yet it remains a modern craft rather than a fully fledged scientific discipline [20][21].
- Future advancements in AI should adhere to the rational norms of modern science and technology, avoiding the influence of non-scientific factors on its development [21].
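For readers unfamiliar with the "scaling law" the article criticizes, the sketch below shows the kind of empirical power-law fit such claims rest on: loss is modeled as a·N^(-b) and fitted by linear regression in log-log space. The data here are synthetic and the numbers arbitrary; the sketch illustrates the curve-fitting procedure only and does not reproduce any published result.

```python
# Toy illustration of an empirical "scaling law": fitting loss ≈ a * N**(-b)
# to synthetic loss-vs-parameter-count data. No real model results implied.
import numpy as np

rng = np.random.default_rng(42)
n_params = np.array([1e7, 3e7, 1e8, 3e8, 1e9, 3e9, 1e10])  # model sizes
true_a, true_b = 50.0, 0.08
# Synthetic "measured" losses with a little multiplicative noise.
loss = true_a * n_params**(-true_b) * np.exp(rng.normal(0, 0.01, n_params.size))

# A power law is linear in log-log space: log(loss) = log(a) - b * log(N),
# so an ordinary least-squares line fit recovers the exponent.
slope, intercept = np.polyfit(np.log(n_params), np.log(loss), deg=1)
a_hat, b_hat = np.exp(intercept), -slope
print(f"fitted loss ≈ {a_hat:.2f} * N^(-{b_hat:.3f})")
print(f"extrapolated loss at N = 1e11: {a_hat * 1e11**(-b_hat):.3f}")
```

The article's point is that such fits describe observed trends; extrapolating them, as the last line does, is an inductive bet rather than a scientifically derived guarantee.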