Brain-like Computing
The Shining Stars of High-Performance Computing
雷峰网· 2025-08-18 11:37
Core Viewpoint
- The article emphasizes the critical role of high-performance computing (HPC) in the development and optimization of large language models (LLMs), highlighting the synergy between hardware and software in achieving efficient model training and inference [2][4][19]

Group 1: HPC's Role in LLM Development
- HPC has become essential for LLMs, with a significant increase in researchers from HPC backgrounds contributing to system software optimization [2][4]
- The evolution of HPC in China has gone through three main stages, from self-developed computers to the current era of supercomputers built with self-developed processors [4][5]
- Tsinghua University's HPC research institute has played a pioneering role in China's HPC development, focusing on software optimization for large-scale cluster systems [5][11]

Group 2: Key Figures in HPC and AI
- Zheng Weimin is recognized as a pioneer in China's HPC and storage fields, contributing significantly to the development of scalable storage solutions and cloud computing platforms [5][13]
- The article discusses the transition of Tsinghua's HPC research focus from traditional computing to storage optimization, driven by the increasing importance of data handling in AI applications [12][13]
- Key researchers like Chen Wenguang and Zhai Jidong have shifted their focus to AI systems software, contributing to the development of frameworks for optimizing large models [29][31]

Group 3: Innovations in Model Training and Inference
- The article details the development of the "Eight Trigrams Furnace" system for training large models, which significantly improved the efficiency of training processes [37][39]
- Innovations such as the FastMoE and SmartMoE frameworks have emerged to optimize the training of mixture-of-experts (MoE) models, showcasing ongoing advances in model training techniques [41][42]
- The Mooncake and KTransformers systems have been developed to improve inference efficiency for large models, using shared storage to reduce computational costs [55][57]
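For context on the MoE frameworks mentioned above (FastMoE, SmartMoE): the operation such frameworks optimize is routing each token to a small subset of experts via a learned gate. The sketch below is a generic top-k gating illustration in NumPy, not code from either framework; all names, shapes, and values are illustrative.

```python
import numpy as np

def top_k_route(x, gate_w, k=2):
    """Route each token to its top-k experts by gate score.

    x: (tokens, d_model) activations; gate_w: (d_model, n_experts).
    Returns expert indices and softmax-normalized combine weights.
    """
    logits = x @ gate_w                        # (tokens, n_experts) gate scores
    idx = np.argsort(logits, axis=-1)[:, -k:]  # top-k expert ids per token
    top = np.take_along_axis(logits, idx, axis=-1)
    # Normalize only over the selected experts (standard top-k gating).
    w = np.exp(top - top.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return idx, w

rng = np.random.default_rng(0)
tokens, d, n_experts = 4, 8, 6
idx, w = top_k_route(rng.normal(size=(tokens, d)),
                     rng.normal(size=(d, n_experts)))
print(idx.shape, w.shape)  # (4, 2) (4, 2)
```

Each selected expert then processes its assigned tokens, and the gate weights combine the expert outputs; much of the systems work in frameworks like these concerns distributing experts across devices and balancing their load.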
Chinese Scientists Develop a New-Generation Neuromorphic Brain-like Computer
Ren Min Ri Bao· 2025-08-15 21:46
Core Insights
- Zhejiang University has launched a new generation of neuromorphic brain-like computer named "Darwin Monkey" or "Wukong" [1]
- The computer is based on specialized neuromorphic chips, supporting over 2 billion pulse neurons and over 100 billion synapses, closely resembling the scale of a macaque monkey's brain [1]

Neuromorphic Computing
- Neuromorphic computing applies the working mechanisms of biological neural networks to computer system design, aiming to create low-power, highly parallel, efficient, and intelligent computing systems [1]
- "Wukong" is equipped with 960 Darwin 3rd generation neuromorphic computing chips developed in collaboration between Zhejiang University and Zhijiang Laboratory [1]
- Each chip supports over 2.35 million pulse neurons and hundreds of millions of synapses, along with a dedicated instruction set for brain-like computing and an online learning mechanism [1]

Technological Breakthroughs
- "Wukong" has achieved breakthroughs in key technologies such as large-scale neural interconnection and integration architecture [1]
A Multi-Synaptic Neuron Model Debuts: Domestic Team Builds a New Engine for Brain-like Computing, Published in Nature Communications
机器之心· 2025-08-15 03:29
While artificial intelligence technology is advancing rapidly, its high energy consumption has become an increasingly prominent problem. Spiking Neural Networks (SNNs) are considered a more biologically plausible and energy-efficient computing paradigm. However, the field still lacks a spiking neuron model that strikes a good balance between computational efficiency and biological plausibility, and this has become one of the key problems constraining the development and application of SNNs.

Specifically, existing spiking neuron models, including Leaky Integrate-and-Fire (LIF), Adaptive LIF (ALIF), Hodgkin-Huxley (HH), and multi-compartment models, focus mainly on simulating the dynamic behavior of neurons and assume that neurons are connected by only a single synapse (i.e., a single channel).

Because spiking neurons represent information in binary form, single-channel connections make it difficult for SNNs to encode both the spatial intensity distribution and the temporal dynamics of input signals. The information loss incurred during this signal encoding keeps the performance of SNNs on spatiotemporal computing tasks from matching, let alone surpassing, continuous-valued artificial neural networks (ANNs).

Recently, the group of Hu Dewen at the College of Intelligence Science and Technology, National University of Defense Technology, in collaboration with the group of Li Guoqi at the Institute of Automation, Chinese Academy of Sciences, proposed ...
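The LIF model mentioned above is the simplest of these spiking neuron models and can be sketched concretely: the membrane potential leaks toward rest, integrates input current, and emits a binary spike on crossing a threshold, which is exactly the binary information representation the text refers to. The parameter values below (time constant, threshold, reset) are illustrative defaults, not taken from the paper.

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron.

    Returns the membrane-potential trace and the binary spike train.
    """
    v = v_rest
    v_trace, spikes = [], []
    for i_t in input_current:
        # Leaky integration: potential decays toward rest, driven by input.
        v += (-(v - v_rest) + i_t) * dt / tau
        if v >= v_threshold:   # threshold crossing: emit a spike
            spikes.append(1)
            v = v_reset        # hard reset after firing
        else:
            spikes.append(0)
        v_trace.append(v)
    return np.array(v_trace), np.array(spikes)

# A constant suprathreshold current yields regular, periodic spiking.
current = np.full(200, 1.5)
v, s = simulate_lif(current)
print(f"{s.sum()} spikes over 200 steps")
```

Note how the real-valued input is reduced to a 0/1 spike train: with a single output channel per connection, downstream neurons see only spike timing, which is the information bottleneck the multi-synaptic model targets.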
Farewell to the Transformer, Reshaping the Machine Learning Paradigm: Shanghai Jiao Tong University Unveils Its First "Brain-like" Large Model
机器之心· 2025-08-13 09:29
Core Viewpoint
- The article discusses the introduction of BriLLM, a new language model inspired by human brain mechanisms, which aims to overcome the limitations of traditional Transformer-based models, such as high computational demands, lack of interpretability, and context size restrictions [3][8]

Group 1: Limitations of Current Models
- Current Transformer-based models face three main issues: high computational requirements, black-box interpretability, and context size limitations [6][8]
- The self-attention mechanism in Transformers has a time and space complexity of O(n²), leading to increased computational costs as input length grows [7]
- The internal logic of Transformers lacks transparency, making it difficult to understand the decision-making process within the model [7][8]

Group 2: Innovations of BriLLM
- BriLLM introduces a new learning mechanism called SiFu (Signal Fully-connected Flowing), which replaces traditional prediction operations with signal transmission, mimicking the way neural signals operate in the brain [9][13]
- The model architecture is based on a directed graph, allowing all nodes to be interpretable, unlike traditional models that only provide limited interpretability at the input and output layers [9][19]
- BriLLM supports unlimited context processing without increasing model parameters, allowing for efficient handling of long sequences [15][16]

Group 3: Model Specifications
- BriLLM has two versions, BriLLM-Chinese and BriLLM-English, with non-sparse model sizes of 16.90 billion parameters for both languages [21]
- The sparse version of the Chinese model has 2.19 billion parameters, while the English version has 0.96 billion parameters, a parameter reduction of approximately 90% [21]
- The model's design allows for the integration of multiple modalities, enabling it to process not just language but also visual and auditory inputs [25][26]

Group 4: Future Prospects
- The team aims to develop a multi-modal brain-inspired AGI framework that integrates perception and motion [27]
- BriLLM has been selected for funding under Shanghai Jiao Tong University's "SJTU 2030" plan, which supports groundbreaking research projects [27]
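The article describes SiFu only at a high level. As a toy abstraction, and emphatically not BriLLM's actual algorithm, signal flow over a directed token graph can be caricatured as: inject a signal at the current token's node, propagate it along outgoing edges, and take the neighbor receiving the strongest signal as the next token. The vocabulary and edge weights below are invented for illustration.

```python
import numpy as np

# Toy vocabulary graph: one node per token, weighted directed edges.
vocab = ["the", "cat", "sat", "mat"]
n = len(vocab)
rng = np.random.default_rng(1)
edges = rng.random((n, n))    # edges[i, j]: signal strength from i to j
np.fill_diagonal(edges, 0.0)  # no self-loops in this toy graph

def next_token(current):
    """Inject a signal at `current` and follow the strongest outgoing edge."""
    signal = edges[vocab.index(current)]  # signal received by each neighbor
    return vocab[int(np.argmax(signal))]

tok = "the"
seq = [tok]
for _ in range(3):
    tok = next_token(tok)
    seq.append(tok)
print(" ".join(seq))
```

The point of the caricature is the interpretability claim: every intermediate state lives on a named token node rather than in an opaque hidden vector, so the "reasoning" path can be read off the graph directly.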
SMIC Capacity in Short Supply; SK Hynix Reportedly Hikes HBM4 Prices Sharply; Samsung DDR4 Discontinuation Reportedly Postponed... Weekly Chip News Roundup (8.4-8.10)
芯世相· 2025-08-11 06:46
Key Events
- Trump announced that the U.S. will impose approximately 100% tariffs on chips and semiconductors [7]
- WSTS reported that the global semiconductor market is expected to grow 18.9% year-on-year in the first half of 2025, reaching $346 billion [10]
- SMIC's Zhao Haijun stated that the capacity shortage will last at least until October this year [7][14]
- Samsung is reportedly extending its DDR4 production plan until December 2026 [7][18]
- SK Hynix has significantly raised pricing for HBM4 [7][19]

Industry Trends
- The Chinese government is pushing for breakthroughs in key brain-machine interface chips, focusing on high-speed, low-power signal processing [9]
- Shanghai is accelerating the development of specialized chips for embodied intelligence [9]
- Global semiconductor sales in Q2 2025 are projected to reach $179.7 billion, with year-on-year growth of nearly 20% [11]

Company Updates
- SMIC reported Q2 revenue of $2.21 billion, a 16% year-on-year increase, with a capacity utilization rate of 92.5% [13][14]
- Hua Hong Semiconductor achieved Q2 revenue of $566 million, with a gross margin of 10.9% [13]
- Samsung is investing in a new 1c DRAM production line, aiming for a monthly capacity of 150,000 to 200,000 wafers by mid-next year [15]

Market Dynamics
- The average trading price of PC DRAM products has increased for four consecutive months, with July's price reaching $3.90, a 50% month-on-month increase [19]
- The advanced IC substrate market is expected to reach $31 billion by 2030, driven by AI and other emerging applications [11]

Technological Advancements
- Zhejiang University announced a breakthrough in neuromorphic computing with the launch of a new generation of brain-like computers, supporting over 2 billion neurons [21]
Zhejiang University Builds the World's Largest Brain-like Computer: 2 Billion Neurons, Approaching Macaque-Brain Scale, and Able to Run DeepSeek
量子位· 2025-08-04 07:00
Core Viewpoint
- Zhejiang University has developed the world's largest neuromorphic computer, "Darwin Monkey" ("Wukong"), which utilizes the third-generation brain-like chip Darwin 3, featuring over 2 billion spiking neurons and 100 billion synaptic connections, significantly advancing artificial intelligence and neuroscience modeling capabilities [1][2][19]

Group 1: Computer Specifications
- The "Darwin Monkey" system is built on Darwin 3 chips and supports over 2 billion spiking neurons in total, closely approaching the neuron count of a macaque brain [1][6]
- Each Darwin 3 chip can handle over 2.35 million spiking neurons and hundreds of millions of synapses, utilizing a 24x24 two-dimensional node grid architecture for efficient inter-node communication [6][8]
- The chip operates on an event-driven architecture, activating only when necessary, which reduces power consumption to as low as 5.47 picojoules per synaptic operation [13]

Group 2: Technological Innovations
- Darwin 3 features a specialized instruction set architecture (ISA) with 10 main instructions for efficient processing of various spiking neuron models and learning rules [9][10]
- The chip employs an innovative connection representation mechanism that significantly compresses storage requirements while increasing the maximum fan-in and fan-out by factors of 1024 and 2048, respectively [11]
- 2.5D advanced packaging technology allows 64 Darwin 3 chips to be packaged directly on a single 12-inch wafer, improving interconnect speed and reducing power consumption [18]

Group 3: Applications and Implications
- The "Darwin Monkey" has successfully deployed intelligent applications, including DeepSeek, and has simulated various animal brains, providing new tools for neuroscience research [19][23]
- The computer not only serves as a foundation for AI development but also offers neuroscientists a means to explore brain mechanisms, potentially reducing reliance on biological experiments [23][24]
- The capabilities of the "Darwin Monkey" are expected to surpass human-brain computational speeds, supporting future research in brain-like artificial intelligence [24]
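The headline figures above are internally consistent, as a quick back-of-the-envelope check shows. The ~2000-watt typical power draw is reported elsewhere in this digest; the final line is only an upper bound, since it assumes all power goes to synaptic operations.

```python
# Figures as reported in the article above.
chips = 960                  # Darwin 3 chips in the Darwin Monkey system
neurons_per_chip = 2.35e6    # spiking neurons supported per chip
energy_per_synop = 5.47e-12  # joules per synaptic operation (5.47 pJ)
power_watts = 2000           # typical power draw reported for the system

total_neurons = chips * neurons_per_chip
print(f"total neurons ~ {total_neurons / 1e9:.2f} billion")  # ~2.26 billion

# Upper bound on synaptic ops per second if every watt fed synaptic
# operations (it does not, so real throughput is lower).
ops_per_second = power_watts / energy_per_synop
print(f"synaptic-op budget <= {ops_per_second:.2e} per second")
```

So 960 chips at 2.35 million neurons each does yield roughly 2.26 billion neurons, matching the "over 2 billion, approaching macaque scale" claim.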
Evening Report | August 4 Theme Preview
Xuan Gu Bao· 2025-08-03 14:15
Group 1: Photovoltaics
- The Ministry of Industry and Information Technology issued a notice on 2025 energy-saving inspection tasks for the polysilicon industry, aiming to reduce the burden on enterprises [1]
- Silicon wafer prices continued to rise, with an average increase of approximately 0.1 yuan per piece, driven by rising raw material costs and increased downstream orders [1]
- Citic Securities indicated that the photovoltaic industry is a core area for addressing homogenization and overcapacity, with potential for price recovery and profit restoration as the industry returns to orderly competition [1]

Group 2: Robotics
- The 2025 World Robot Conference will be held in Beijing from August 8 to 12, featuring over 1,500 exhibits from more than 200 domestic and international robot companies, with nearly double the number of new products launched compared to last year [2]
- Huaxi Securities noted that humanoid robots are on the verge of commercial explosion, driven by policy support, technological maturity, and rising demand, with the global market expected to exceed $150 billion by 2035 [2]

Group 3: Nuclear Fusion
- Helion Energy announced the start of site construction for its first controllable nuclear fusion power plant, a significant step toward commercializing fusion technology [3]
- Nuclear fusion is viewed as a key to achieving "energy freedom," providing a clean and virtually limitless energy source with minimal carbon emissions [3]

Group 4: Brain Science
- Zhejiang University unveiled the Darwin Monkey, a new generation of neuromorphic brain-like computer featuring over 2 billion pulse neurons and a power consumption of approximately 2000 watts [4]
- Brain-like computing is positioned as a frontier field at the intersection of artificial intelligence and neuroscience, with potential for commercialization in specific areas within 3-5 years [4]

Group 5: Solid-State Batteries
- SAIC confirmed that the new MG4 electric hatchback will be the world's first mass-produced electric vehicle equipped with a semi-solid-state battery, featuring a 70 kWh pack with an energy density of 180 Wh/kg and a range of 537 kilometers [5]
- The semi-solid-state battery shows improved low-temperature performance compared with traditional lithium iron phosphate batteries [5]

Group 6: Smart Factories
- The Ministry of Industry and Information Technology released a digital transformation implementation plan for the machinery industry, aiming for 50% of enterprises to reach maturity level two or above by 2027 [6]
- The smart factory market surpassed 1 trillion yuan in 2022, with significant penetration in the automotive and electronics sectors, and is expected to expand further with the integration of AI and other new technologies [6]

Group 7: Gold Market
- Gold prices surged on changes in U.S. tariff policy and disappointing employment data, with spot gold reaching $3362.64 per ounce, up 2.22% [6]
- The market anticipates a potential Federal Reserve interest rate cut in November, further enhancing gold's appeal as a safe-haven asset amid ongoing geopolitical tensions [6]

Group 8: Macro and Industry News
- The People's Bank of China emphasized support for stable capital market operations and financing for technology-driven SMEs [7]
- The Ministry of Industry and Information Technology and other departments issued a digital transformation plan for the machinery industry [7]
- The 2025 World Robot Expo will feature the largest number of humanoid robot manufacturers of any similar event [7]
Autonomous-Driving Crash Kills One and Injures One, Tesla Ordered to Pay $243 Million; Chenglong Trucks Releases Two More Posters Taking Aim at Li Auto; Cook Becomes the Longest-Serving CEO in Apple's History | Bang Morning Report
创业邦· 2025-08-03 01:10
Group 1
- Tesla was ordered to pay $243 million in damages for a fatal accident involving its autonomous driving technology, with the jury finding Tesla partially responsible due to technology failure [1]
- The compensation includes $200 million in punitive damages and $43 million related to the accident [1]

Group 2
- The State Administration for Market Regulation in China released a compliance guide for online trading platforms, emphasizing the obligation to publicly disclose fee rules and prohibiting double charging [4]
- The guide mandates that platforms cannot charge fees without providing services and must not impose unreasonable fees on operators [4]

Group 3
- Tim Cook has become the longest-serving CEO in Apple's history, surpassing Steve Jobs with 5,091 days in office as of August 1, 2025 [4]
- Cook emphasized the importance of seizing opportunities in artificial intelligence during a rare all-hands meeting [10]

Group 4
- Berkshire Hathaway reported a net profit of $12.37 billion for Q2 2025, down nearly 60% year-over-year, with total revenue of $92.515 billion [14]
- The company did not repurchase any Class A or B shares during the quarter [14]

Group 5
- China's automotive industry saw a 6.5% year-over-year increase in vehicle sales in July 2025, with a significant rise in sales of self-owned-brand and new energy vehicles [31]
- Smartphone production in China reached 563 million units in the first half of 2025, up 0.5% year-over-year [32]
Breakthrough Progress! World's First Such Machine Successfully Developed
中国基金报· 2025-08-03 00:25
Core Viewpoint
- The article discusses the launch of "Wukong," the world's largest neuromorphic brain-like computer developed by Zhejiang University, which features over 2 billion neurons and more than 100 billion synapses, marking a significant advancement in brain-inspired computing technology [2][3]

Group 1
- "Wukong" is the first neuromorphic brain-like computer with a neuron scale exceeding 2 billion, surpassing Intel's Hala Point system, which has 1.15 billion neurons [2]
- The computer draws approximately 2000 watts under typical operating conditions, showcasing its low power consumption compared to traditional computing systems [2]
- "Wukong" is built on 960 Darwin 3rd generation brain-like computing chips, organized into 15 blade-style neuromorphic servers [2][3]

Group 2
- The research team developed a new-generation Darwin brain-like operating system alongside "Wukong," enabling it to run various intelligent applications, including DeepSeek brain-like models for logical reasoning, content generation, and mathematical problem-solving [3]
- "Wukong" can simulate the brain structures of various animals, including nematodes, zebrafish, mice, and macaques, providing a versatile platform for neuroscience research [3][4]
- The computer's high parallelism and efficiency are expected to offer new computational paradigms for existing computing scenarios and support the development of artificial intelligence [3][4]
Breakthrough Progress in China's Neuromorphic Brain-like Computing
财联社· 2025-08-02 14:37
Core Viewpoint
- The article discusses the launch of a new generation of neuromorphic brain-like computer named "Darwin Monkey" ("Wukong") by Zhejiang University, featuring over 2 billion pulse neurons and more than 100 billion synapses, closely resembling the scale of a macaque brain [1]

Group 1
- "Wukong" is the first neuromorphic computer internationally to exceed 2 billion neurons based on dedicated neuromorphic chips [1]
- The computer draws approximately 2000 watts under typical running conditions, showcasing its efficiency [1]
- Neuromorphic computing aims to apply the mechanisms of biological neural networks to computer system design, creating low-power, highly parallel, efficient, and intelligent computing systems [1]