Industry Commentary Report: Vision-Tactile Sensing, the "Last Piece of the Puzzle" for Tesla's Dexterous Hand
KAIYUAN SECURITIES · 2025-08-18 05:47
**Investment Rating**
- The industry investment rating is "Positive" (maintained) [1]

**Core Insights**
- Tactile sensors are pivotal to optimizing Tesla's Optimus robotic hand; the engineering effort for the hand is said to account for half of the robot's overall development [16][18]
- Vision-tactile technology is a key route to richer robotic interaction with the physical world, combining optical imaging with tactile perception to deliver high-resolution tactile information [5][18]
- Domestic companies are rapidly closing the gap with international leaders in tactile technology, leveraging local supply chains and application scenarios [7][47]

**Summary by Sections**

**Section 1: Vision-Tactile Sensing**
- Tesla's Optimus robotic hand, whose optimization hinges on tactile sensors, features 22 degrees of freedom and tactile sensors on all fingers [16][17]
- Vision-tactile sensors capture microscopic deformations during contact with objects and convert them into high-resolution tactile data, enabling robots to perform delicate operations [18][22]
- The technology perceives normal force, shear force, object pose, and texture simultaneously, closely mimicking human tactile capabilities [33]

**Section 2: Domestic Companies Catching Up**
- GelSight is the global leader in vision-tactile sensors; domestic startups have made significant advances in material design and deep learning models [7][47]
- Notable companies include:
  - **叠动科技 (Diedong Technology)**: pioneered the integration of MEMS technology with vision-tactile sensing; received strategic investment from 隆盛科技 (Longsheng Technology) [49][50]
  - **帕西尼 (Paxini)**: a leader in multi-dimensional tactile technology; has received over 100 million yuan in strategic investment from BYD [56][61]
  - **一目科技 (Yimu Technology)**: developed the world's first full-stack tactile system designed for precision operations, backed by 松霖科技 (Songlin Technology) [62]

**Section 3: Investment Recommendations**
- Recommended stock: 隆盛科技 (Longsheng Technology), as a beneficiary of advances in vision-tactile technology [8]
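The idea of recovering both normal and shear force from gel deformation can be illustrated with a toy marker-tracking sketch. All names, gains, and the geometry heuristic below are hypothetical simplifications for illustration; real vision-tactile sensors such as GelSight use dense optical flow over the gel image and learned calibration, not these two constants.

```python
import numpy as np

def estimate_forces(markers_rest, markers_pressed,
                    shear_gain=0.5, normal_gain=2.0):
    """Toy vision-tactile readout from tracked gel markers.

    markers_rest / markers_pressed: (N, 2) arrays of marker centroids
    in mm, imaged before and during contact. shear_gain and normal_gain
    are hypothetical calibration constants mapping displacement (mm)
    to force (N).
    """
    disp = markers_pressed - markers_rest          # (N, 2) displacement field
    # Shear force ~ mean tangential drift of the whole marker field.
    shear = shear_gain * disp.mean(axis=0)
    # Normal force ~ how far markers spread outward from the contact
    # centroid (the gel bulges sideways under indentation).
    center = markers_rest.mean(axis=0)
    radial_rest = np.linalg.norm(markers_rest - center, axis=1)
    radial_pressed = np.linalg.norm(markers_pressed - center, axis=1)
    normal = normal_gain * (radial_pressed - radial_rest).mean()
    return shear, normal

# Pure indentation: markers spread 10% outward, no tangential drift.
rest = np.array([[0., 1.], [1., 0.], [-1., 0.], [0., -1.]])
shear, normal = estimate_forces(rest, rest * 1.1)
# shear stays near zero; normal is positive with the toy gains above
```

The same displacement field, read with different statistics, yields both force components at once, which is why a single camera view of the gel can stand in for several discrete force channels.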
Science Robotics | Yale University Open-Sources a New Vision-Tactile Paradigm: Reading a Compliant Robot Hand's Force Perception by Sight
机器人圈 · 2025-07-08 10:36
**Core Viewpoint**
- Yale University's research introduces a new paradigm, "Forces for Free" (F3), in robotic tactile sensing, enabling robots to estimate contact forces using only standard RGB cameras and thus achieving force perception at almost zero additional hardware cost [1][2][22].

**Group 1: Challenges in Force Perception**
- Traditional high-precision force/torque (F/T) sensors are expensive, bulky, and prone to damage, while integrated fingertip tactile sensors suffer from complex wiring and limited information [2].
- Recent advances in vision-tactile sensing offer new solutions by inferring tactile information from visual signals, but many existing methods require embedded markers or custom sensor skins [2].

**Group 2: F3 Gripper Design**
- The F3 gripper is an optimization of Yale's open-source T42 gripper, designed to raise the signal-to-noise ratio of visual force estimation [6].
- Key optimizations include maximizing kinematic manipulability to avoid singular configurations, and minimizing friction and hysteresis by replacing metal pins with miniature ball bearings, cutting internal friction from approximately 4.0 N to 0.6 N [6][7][9].

**Group 3: Estimation Algorithm**
- A deep learning estimator decodes precise force information from image sequences, using a CNN-Transformer architecture to combine spatial features with temporal memory [10][11].
- Processing a sequence of images mitigates the "same shape, different force" ambiguity, and a visual foundation model (SAM) improves robustness against visual interference [11][13].

**Group 4: Experimental Validation**
- In static force prediction tasks, the system achieves estimation errors between 0.2 N and 0.4 N, significantly outperforming previous work [14].
- In dynamic closed-loop control experiments, the estimator guided robots through complex tasks such as pin insertion, surface wiping, and calligraphy writing, achieving average force errors as low as 0.15 N [21][22].

**Group 5: Future Implications**
- The work offers a practical solution for low-cost robotic force perception, redefining the cost-effectiveness of vision-tactile sensing [22].
- Future extensions may include three-dimensional force/torque estimation and multi-finger dexterous hands, potentially broadening the application of advanced force control across robotic platforms [22].
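The dynamic experiments above amount to a force-servo loop: the vision-based force estimate replaces an F/T sensor reading inside an otherwise standard controller. The sketch below is a minimal 1-D proportional controller of that shape; the function name, gains, and limits are illustrative assumptions, not the paper's implementation.

```python
def force_servo_step(estimated_force, target_force, z_pos,
                     kp=0.002, z_limits=(-0.05, 0.0)):
    """One step of a toy 1-D proportional force controller.

    Moves the tool down (more negative z, in metres) when the estimated
    contact force is below target, and retracts when it overshoots.
    estimated_force would come from the vision-based estimator in place
    of an F/T sensor; kp (m/N) and z_limits (m) are hypothetical tuning
    values.
    """
    error = target_force - estimated_force       # N; positive = pressing too lightly
    z_new = z_pos - kp * error                   # press harder if under target
    # Clamp to the allowed travel range so the tool never plunges.
    return min(max(z_new, z_limits[0]), z_limits[1])

# Starting at the surface (z = 0) with no measured contact force and a
# 2 N target, one step commands a small downward move.
z = force_servo_step(estimated_force=0.0, target_force=2.0, z_pos=0.0)
```

Run at camera rate, a loop like this lets the estimated force track a target profile, which is the essence of the wiping and calligraphy demonstrations.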