Brain-Inspired Perception: A New Aid for Robot Navigation
Ren Min Ri Bao · 2025-10-27 08:04
Core Insights
- The article discusses LENS, a new navigation system developed by researchers at Queensland University of Technology that mimics human brain perception, enabling robots to navigate without GPS or high energy consumption [1][2].

Group 1: Technology and Innovation
- LENS is inspired by the brain's neural information encoding, aiming to let robots perform complex tasks with minimal energy, much as the human brain processes information on roughly 20 watts [1].
- The system uses a novel "dynamic vision sensor," or "event camera," whose pixels activate only when they detect changes in brightness or motion, sharply reducing unnecessary energy consumption [1][2].
- The neural architecture designed for LENS processes information as electrical pulses, simulating real neuronal signal transmission and enabling adaptive learning [2].

Group 2: Performance and Applications
- LENS consumes less than 10% of the energy of traditional navigation systems and occupies only 180 KB of storage, making it highly efficient at recognizing locations within an 8-kilometer range [2].
- The system is particularly advantageous where traditional navigation fails, such as disaster sites, tunnels, dense forests, or extraterrestrial locations, because it does not rely on external positioning support [2].
- Initial tests indicate that LENS matches traditional navigation methods in positioning accuracy and system stability [2].

Group 3: Future Development and Challenges
- The LENS system is still in the research and development phase, with room for significant advances as processor performance, sensor accuracy, and algorithm models improve [3].
- Key challenges for widespread application include improving the system's stability when handling non-continuous event information and strengthening the capabilities of neuromorphic processors [3].
- Successful industrialization of this technology will depend on achieving deep collaboration among various sensory modalities, efficient support from brain-like chips, and the continuous evolution of adaptive algorithms [3].
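The event-camera idea above can be illustrated with a toy model. This is a minimal sketch, not the LENS implementation: each pixel emits an event only when its log-brightness changes by more than a threshold, so a static scene produces no data at all. The function name and threshold value are illustrative assumptions.

```python
import numpy as np

def event_camera(prev_frame, curr_frame, threshold=0.2):
    """Toy model of a dynamic vision sensor ("event camera").

    Instead of transmitting every pixel of every frame, each pixel
    emits an event only when its log-brightness changes by more than
    a threshold -- the property the article credits for the system's
    low energy use. Returns a list of (row, col, polarity) events.
    """
    # Work in log intensity: real event cameras respond to relative,
    # not absolute, brightness changes.
    diff = np.log1p(curr_frame) - np.log1p(prev_frame)
    events = []
    for r, c in zip(*np.nonzero(np.abs(diff) > threshold)):
        polarity = 1 if diff[r, c] > 0 else -1  # brighter vs. darker
        events.append((int(r), int(c), polarity))
    return events

# A static scene produces no events at all ...
frame = np.full((4, 4), 0.5)
assert event_camera(frame, frame) == []

# ... while a single brightened pixel yields exactly one event.
moved = frame.copy()
moved[1, 1] = 1.0
print(event_camera(frame, moved))  # [(1, 1, 1)]
```

Because output scales with scene change rather than frame rate, energy and bandwidth drop sharply in mostly static environments, which is the behavior the article describes.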
Brain-Inspired Perception: A New Aid for Robot Navigation (Innovation Roundup)
Ren Min Ri Bao · 2025-10-26 22:03
Core Insights
- A new navigation system named LENS, developed by researchers at Queensland University of Technology, mimics human brain perception to let robots navigate without GPS or high energy consumption [1][2]
- The system uses a novel camera, the "dynamic vision sensor" or "event camera," whose pixels activate only when they detect changes in brightness or motion, sharply reducing unnecessary energy consumption [1][2]
- LENS consumes less than 10% of the energy of traditional navigation systems and occupies only 180 KB of storage, making it highly efficient in complex environments [2]

Group 1
- The LENS system marks a shift away from traditional navigation methods that depend on pre-built high-precision maps and heavy computing power, toward real-time environmental adaptation and energy efficiency [3]
- In preliminary tests under various conditions, the system's positioning accuracy and stability were comparable to those of traditional navigation methods [2][3]
- Future development aims to extend the recognition range of LENS and to integrate it into lightweight vehicles or wearable devices for greater adaptability and longer endurance across mobile scenarios [3]

Group 2
- The core breakthrough of LENS is that it operates without external positioning support, suiting it to signal-blind areas such as disaster sites, tunnels, and remote locations [2]
- The research team notes that the system is still under development, with room for significant advances as processor performance, sensor accuracy, and algorithm models improve [3]
- Key challenges for widespread application include improving the system's stability in real-world environments and achieving deep collaboration among multiple perception modalities [3]
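The pulse-based neural processing described above is commonly modeled with spiking neurons. The sketch below shows a leaky integrate-and-fire (LIF) neuron, a standard building block of spiking networks; all parameter values and names here are illustrative assumptions, not details of the LENS architecture.

```python
def lif_neuron(input_current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Minimal leaky integrate-and-fire neuron.

    The membrane potential leaks toward rest while integrating input;
    when it crosses a threshold, the neuron emits a discrete spike and
    resets. Information thus travels as sparse electrical pulses rather
    than dense floating-point activations, which is the energy-saving
    principle spiking architectures exploit.
    """
    v = v_reset
    spikes = []
    for i in input_current:
        v += dt * (-v / tau + i)    # leak toward rest, integrate input
        if v >= v_thresh:           # threshold crossed: fire
            spikes.append(1)
            v = v_reset             # reset after the spike
        else:
            spikes.append(0)
    return spikes

# Sparse input produces sparse output spikes.
out = lif_neuron([0.0, 0.0, 1.5, 0.0, 1.5, 0.0])
print(out)  # [0, 0, 1, 0, 1, 0]
```

When input is quiet the neuron stays silent and consumes nothing downstream, mirroring how an event-driven system only does work when the environment changes.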