IEEE-RAS International Conference on Humanoid Robots

Learning Efficient Omni-Directional Capture Stepping for Humanoid Robots from Human Motion and Simulation Data



Abstract

Two key questions in the context of stepping for push recovery are where to step and how to step there. In this paper we present a fast and computationally lightweight approach to capture stepping for full-sized humanoid robots. To this end, we developed an efficient parametric step motion generator based on dynamic movement primitives (DMPs) learnt from human demonstrations. Simulation-based reinforcement learning (RL) is used to find a mapping from estimated push parameters (push direction and intensity) to step parameters (step location and step execution time) that are fed to the motion generator. Successful omni-directional capture stepping for 89% of the test cases with pushes from various directions and intensities is achieved with minimal computational effort after 500 training iterations. We evaluate our method in a dynamic simulation of the ARMAR-4 humanoid robot.
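The parametric step motion generator is built on discrete dynamic movement primitives. As a minimal sketch of that building block (not the paper's implementation), the following integrates one degree of freedom of a discrete DMP in the common Ijspeert formulation: a critically damped point attractor toward the goal, modulated by a phase-gated forcing term whose weights `w` would be learnt from human demonstrations. All gains, the basis-width heuristic, and the function name are illustrative assumptions of this sketch.

```python
import numpy as np

def dmp_rollout(y0, g, w, tau=1.0, dt=0.001,
                alpha_z=25.0, beta_z=6.25, alpha_x=4.0):
    """Integrate one DoF of a discrete DMP from start y0 to goal g.

    w holds the forcing-term weights (learnt from demonstrations in
    the paper; zeros here reduce the system to a plain point
    attractor). Gains follow the standard critically damped choice
    beta_z = alpha_z / 4; all values are illustrative.
    """
    n = len(w)
    # Basis-function centers spaced along the decaying phase x,
    # with a simple width heuristic (an assumption of this sketch).
    c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n))
    h = n ** 1.5 / c
    x, y, z = 1.0, y0, 0.0   # phase, position, scaled velocity
    traj = [y]
    for _ in range(int(1.5 * tau / dt)):
        psi = np.exp(-h * (x - c) ** 2)
        # Forcing term: gated by x, so it vanishes as the phase
        # decays and the trajectory converges to the goal g.
        f = x * (g - y0) * (psi @ w) / (psi.sum() + 1e-10)
        z += dt * (alpha_z * (beta_z * (g - y) - z) + f) / tau
        y += dt * z / tau
        x += dt * (-alpha_x * x) / tau
        traj.append(y)
    return np.array(traj)
```

In the paper's pipeline, the RL-learned mapping would supply the step parameters (step location and execution time), which here correspond to the goal `g` and the temporal scaling `tau` of each DMP.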


