One demonstration and the robot gets to work? A joint team from Peking University and BeingBeyond uses a "hierarchical cerebellum + simulated avatar" approach to put the G1 on the job zero-shot
量子位·2025-11-13 09:25

Core Insights
- The article introduces DemoHLM, a framework that lets humanoid robots generate large volumes of training data from a single human demonstration in simulation, addressing key challenges in loco-manipulation [1][22]

Group 1: Challenges in Humanoid Robot Manipulation
- Humanoid loco-manipulation faces a "triple dilemma": existing solutions either remain confined to simulation or require extensive real-world teleoperation data, making them impractical for complex environments such as homes and factories [3][6]
- Traditional methods suffer from low data efficiency, poor task generalization, and difficult sim-to-real transfer, leading to high costs and limited scalability [6][20]

Group 2: Innovations of DemoHLM
- DemoHLM employs a hierarchical control architecture that separates low-level motion control from high-level task decision-making, improving both flexibility and stability [7][20]
- The framework's key innovation is generating a large, diverse set of training trajectories from just one demonstration, significantly improving data efficiency and generalization [8][20]

Group 3: Experimental Validation
- Validation covered ten manipulation tasks, both in simulation (IsaacGym) and on a real Unitree G1 robot, with notable success rates [9][19]
- As the volume of synthetic data grew from 100 to 5,000 trajectories, task success rates improved markedly, demonstrating the effectiveness of the data-generation pipeline [14][20]

Group 4: Industry Implications and Future Directions
- DemoHLM's advances provide critical technical support for the practical deployment of humanoid robots, reducing training costs and improving generalization across scenarios [19][20]
- The framework is designed to accommodate future upgrades such as tactile sensors and multi-camera perception, paving the way toward more complex operating environments [21][20]
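The hierarchical split described above — a high-level policy making task decisions and a low-level controller executing motion — can be sketched in miniature as follows. All class and method names here are hypothetical illustrations, not DemoHLM's actual API; the low-level controller is reduced to a trivial kinematic integrator standing in for a learned whole-body controller.

```python
import numpy as np

class LowLevelController:
    """Tracks base velocity commands. In a real system this would be a
    learned whole-body locomotion controller; here it is a kinematic
    integrator with a clipped command envelope."""
    def __init__(self, dt=0.02):
        self.dt = dt
        self.base_xy = np.zeros(2)

    def step(self, cmd_vel_xy):
        # Clip commands to a safe envelope, then integrate base position.
        cmd = np.clip(cmd_vel_xy, -0.5, 0.5)
        self.base_xy = self.base_xy + cmd * self.dt
        return self.base_xy

class HighLevelPolicy:
    """Maps observations to velocity commands. A simple proportional
    controller toward a goal stands in for a learned manipulation policy."""
    def __init__(self, gain=1.0):
        self.gain = gain

    def act(self, base_xy, goal_xy):
        return self.gain * (goal_xy - base_xy)

# Hierarchical loop: the policy decides, the controller executes.
ctrl = LowLevelController()
policy = HighLevelPolicy()
goal = np.array([0.4, -0.2])
for _ in range(1000):
    cmd = policy.act(ctrl.base_xy, goal)
    ctrl.step(cmd)
print(np.allclose(ctrl.base_xy, goal, atol=1e-2))  # True
```

The separation means the high-level policy can be retrained per task while the low-level controller stays fixed, which is one reason such architectures transfer more stably from simulation to hardware.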
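The one-demonstration data pipeline can likewise be illustrated with a minimal sketch: record the demo's waypoints in the object's frame, then replay them under many randomized object poses to synthesize diverse trajectories. The function names and randomization ranges below are illustrative assumptions, not DemoHLM's actual pipeline.

```python
import numpy as np

def retarget_demo(demo_rel_waypoints, obj_pose):
    """Transform object-relative 2D waypoints into the world frame for a
    new object pose (x, y, yaw) — hypothetical retargeting step."""
    x, y, yaw = obj_pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    return demo_rel_waypoints @ R.T + np.array([x, y])

def generate_synthetic_demos(demo_rel_waypoints, n, rng):
    """Randomize the object pose n times and retarget the single demo
    to each pose, yielding n synthetic trajectories."""
    demos = []
    for _ in range(n):
        pose = (rng.uniform(-0.3, 0.3),
                rng.uniform(-0.3, 0.3),
                rng.uniform(-np.pi, np.pi))
        demos.append(retarget_demo(demo_rel_waypoints, pose))
    return demos

rng = np.random.default_rng(0)
# One demonstration: three waypoints expressed in the object frame.
one_demo = np.array([[0.0, 0.0], [0.1, 0.0], [0.1, 0.1]])
synthetic = generate_synthetic_demos(one_demo, 5000, rng)
print(len(synthetic))  # 5000
```

Scaling `n` from 100 to 5,000 in this kind of pipeline is cheap because every trajectory is derived from the same single demonstration, which is consistent with the reported trend of success rates rising as synthetic data volume grows.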