Beijing Humanoid Open-Sources XR-1 Model, Pushing Embodied Intelligence into a New Stage of "Fully Autonomous and More Usable"
Zheng Quan Ri Bao Wang · 2025-12-19 13:43
Core Insights
- Beijing Humanoid Robot Innovation Center has officially open-sourced XR-1, the first and only embodied vision-language-action (VLA) model to pass the national standard test for embodied intelligence, together with its supporting data foundations, RoboMIND 2.0 and the latest version of ArtVIP, aiming to move the embodied intelligence industry into a new stage of "fully autonomous and more usable" robots [1][5]

Group 1: Technological Advancements
- The XR-1 model targets a core pain point of the embodied intelligence industry: robots struggle with basic physical tasks because visual perception is disconnected from action execution (a generic VLA control loop is sketched after this summary) [2]
- XR-1 offers multi-scenario, multi-embodiment, and multi-task capabilities with strong generalization, enabling smooth humanoid control and precise object manipulation [2][3]
- The RoboMIND 2.0 dataset has been upgraded to more than 300,000 operation trajectories and expanded to cover 11 application scenarios, significantly strengthening the data available for training robots [3]

Group 2: Collaborative Efforts and Applications
- Beijing Humanoid has partnered with multiple organizations to deploy humanoid robots across industries, including successful use in manufacturing and high-risk inspection [4]
- A collaboration with Bayer aims to apply humanoid robots and embodied intelligence to solid drug manufacturing, packaging, quality control, and logistics [4]
- The overall strategy is to build a comprehensive open-source ecosystem so that companies and developers can innovate without starting from scratch, accelerating large-scale deployment of robots across fields [4][5]
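The article does not publish XR-1's interface, so the following is only a minimal conceptual sketch of how a generic vision-language-action policy ties perception to action in a closed control loop. Every class and function name here (Observation, VLAPolicy, control_loop, get_obs, send_action) is a placeholder invented for illustration, not part of the XR-1 or RoboMIND 2.0 releases.

```python
# Conceptual sketch only: a generic vision-language-action (VLA) control loop.
# All names are hypothetical placeholders, not XR-1's actual API.

from dataclasses import dataclass
import numpy as np


@dataclass
class Observation:
    rgb: np.ndarray       # H x W x 3 camera frame
    proprio: np.ndarray   # joint positions / gripper state


class VLAPolicy:
    """Stand-in for a pretrained VLA model (hypothetical, not XR-1)."""

    def predict_action(self, obs: Observation, instruction: str) -> np.ndarray:
        # A real model would encode the camera frame and the natural-language
        # instruction with a vision-language backbone and decode robot actions;
        # this stub just returns a zero action of a plausible shape.
        return np.zeros(7)  # e.g. 6-DoF end-effector delta + gripper command


def control_loop(policy: VLAPolicy, get_obs, send_action,
                 instruction: str, steps: int = 100) -> None:
    """Closed-loop control: observe, infer an action, execute, repeat."""
    for _ in range(steps):
        obs = get_obs()                                    # read cameras + joints
        action = policy.predict_action(obs, instruction)   # VLA inference step
        send_action(action)                                # command the robot
```

The point of the sketch is the structure, not the model internals: each control step feeds fresh visual and proprioceptive input plus the task instruction into the policy, which is what lets a VLA-style model couple perception directly to action execution.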