Presence: Human-robot interaction by understanding upper body gestures
Abstract

In this paper, a human-robot interaction system based on a novel combination of sensors is proposed. It allows one person to interact with a humanoid social robot using natural body language. The robot understands the meaning of human upper-body gestures and expresses itself through a combination of body movements, facial expressions, and verbal language. A set of 12 upper-body gestures is used for communication; the set also includes gestures involving human-object interactions. The gestures are characterized by head, arm, and hand posture information. The wearable Immersion CyberGlove II is employed to capture hand posture, and this information is combined with the head and arm posture captured by the Microsoft Kinect, forming a new sensor solution for human-gesture capture. Based on the posture data from the CyberGlove II and Kinect, an effective, real-time human gesture recognition method is proposed. This gesture understanding approach, built on an innovative combination of sensors, is the main contribution of the paper. To verify the effectiveness of the proposed gesture recognition method, a human body gesture data set is built. The experimental results demonstrate that our approach recognizes the upper-body gestures with high accuracy in real time. In addition, for robot motion generation and control, a novel online motion planning method is proposed. To generate appropriate dynamic motion, a quadratic programming (QP)-based dual-arm kinematic motion generation scheme is proposed, and a simplified recurrent neural network is employed to solve the QP problem. The integration of a handshake within the HRI system illustrates the effectiveness of the proposed online generation method.
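The abstract describes fusing hand posture from the CyberGlove II with head and arm posture from the Kinect before recognition. The paper's actual classifier is not given here; as a minimal sketch under assumed dimensions (22 glove channels, 18 Kinect skeleton values, 12 gesture classes) and an assumed nearest-centroid rule, the fusion step could look like:

```python
import numpy as np

# Hedged sketch: concatenate per-frame features from the two sensors and
# classify with a nearest-centroid rule over the 12 gesture classes.
# Dimensions and the classifier are illustrative assumptions, not the paper's method.
N_GLOVE = 22    # CyberGlove II joint-angle channels (hand posture)
N_KINECT = 18   # e.g., 6 upper-body joints x 3D position from Kinect (head/arm posture)
N_CLASSES = 12  # size of the upper-body gesture set

rng = np.random.default_rng(1)
centroids = rng.standard_normal((N_CLASSES, N_GLOVE + N_KINECT))  # stand-in templates

def classify(glove_frame, kinect_frame):
    """Concatenate hand + head/arm posture and return the nearest gesture id."""
    feat = np.concatenate([glove_frame, kinect_frame])
    dists = np.linalg.norm(centroids - feat, axis=1)
    return int(np.argmin(dists))

# A frame identical to template 5 must come back as gesture 5.
frame = centroids[5]
print(classify(frame[:N_GLOVE], frame[N_GLOVE:]))  # → 5
```

The point of the sketch is only the sensor fusion: both devices contribute to a single feature vector per frame, so hand shape and arm configuration are classified jointly.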
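The abstract also mentions a QP-based dual-arm kinematic motion generation scheme solved by a simplified recurrent neural network. The paper's exact formulation is not given here; as a minimal sketch for a single arm, assuming the standard minimum-velocity-norm QP (minimize ½‖q̇‖² subject to J q̇ = v) and a dual-network-style gradient flow, with a random stand-in Jacobian:

```python
import numpy as np

# Minimal sketch (not the paper's exact scheme): redundancy resolution for one
# 7-DOF arm as the QP   minimize 1/2 ||qdot||^2   s.t.   J qdot = v,
# solved with a recurrent-network-style dual dynamics
#   lam' = gamma * (v - J J^T lam),   qdot = J^T lam.
# The Jacobian below is a random stand-in; a real system supplies it per pose.
rng = np.random.default_rng(0)
J = rng.standard_normal((3, 7))          # illustrative 3x7 arm Jacobian
v = np.array([0.10, -0.05, 0.02])        # desired end-effector velocity

lam = np.zeros(3)
gamma, dt = 5.0, 0.005
for _ in range(40000):                   # Euler-integrate the dual dynamics
    lam += dt * gamma * (v - J @ (J.T @ lam))
qdot = J.T @ lam

# The fixed point is the minimum-norm solution, i.e. the pseudoinverse answer.
print(np.allclose(qdot, np.linalg.pinv(J) @ v, atol=1e-6))  # → True
```

The design choice the abstract points at is that such neuro-dynamic solvers iterate cheap matrix-vector updates, which suits online motion generation better than re-solving the QP from scratch at every control step.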
