Human-Robot Interaction (HRI 2010), 2010

Modular control for human motion analysis and classification in Human-Robot interaction



Abstract

Trajectories followed by humans can be interpreted as attitude gestures. Based on this interpretation, an autonomous mobile robot can decide how to initiate interaction with a given human. This work presents a modular control system that analyzes human walking trajectories in order to engage a robot in Human-Robot interaction. When the robot detects a human with its vision system, a visual tracking module begins to operate the Pan/Tilt/Zoom (PTZ) camera unit. The camera parameter configuration and the global robot localization are then used by another module to filter and track the human's legs in the laser range finder (LRF) data. The path followed by the human in the global reference frame is then processed by a further module, which determines the kind of attitude shown by the human. Based on the result, the robot decides whether an interaction is needed and who is expected to begin it. At the moment only three kinds of attitudes are used: confidence, curiosity and nervousness.
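The trajectory-to-attitude step of the pipeline above can be sketched with simple geometric features. The following is a minimal illustrative sketch, not the authors' actual classifier: the features (approach toward the robot, path straightness) and all thresholds are assumptions chosen only to show how a walking path might map onto the three attitudes named in the abstract.

```python
import math

def classify_attitude(trajectory, robot_pos=(0.0, 0.0)):
    """Map a human walking path, given as (x, y) points in the robot's
    global frame, to 'confidence', 'curiosity', or 'nervousness'.
    Features and thresholds are illustrative assumptions only."""
    if len(trajectory) < 2:
        return "curiosity"

    # Net change in distance to the robot: positive => moving closer.
    d_start = math.dist(trajectory[0], robot_pos)
    d_end = math.dist(trajectory[-1], robot_pos)
    approach = d_start - d_end

    # Path straightness: net displacement over total path length
    # (1.0 = perfectly straight, near 0 = meandering).
    total = sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))
    net = math.dist(trajectory[0], trajectory[-1])
    straightness = net / total if total > 0 else 0.0

    if approach > 0 and straightness > 0.8:
        return "confidence"      # direct, purposeful approach
    if approach > 0:
        return "curiosity"       # approaching, but hesitantly
    return "nervousness"         # keeping or increasing distance

# Example: a straight walk toward a robot at the origin.
path = [(5.0, 0.0), (4.0, 0.0), (3.0, 0.0), (2.0, 0.0)]
print(classify_attitude(path))  # -> 'confidence'
```

In the paper's architecture this logic would sit in the module that consumes the LRF-tracked leg positions expressed in the global reference frame; a real system would use richer features (speed profile, pauses, gaze) rather than two scalar heuristics.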
