ECCV 2004 Workshop on Computer Vision in Human-Computer Interaction (HCI); May 16, 2004; Prague, Czech Republic

Real-Time Person Tracking and Pointing Gesture Recognition for Human-Robot Interaction


Abstract

In this paper, we present our approach to visual tracking of the head and hands and to estimating head orientation. Given the images provided by a calibrated stereo camera, color and disparity information are integrated into a multi-hypothesis tracking framework in order to find the 3D positions of the respective body parts. Based on the hands' motion, an HMM-based approach is applied to recognize pointing gestures. We show experimentally that gesture recognition performance can be improved significantly by using visually obtained head-orientation information as an additional feature. Our system aims at applications in the field of human-robot interaction, where it is important to perform run-on recognition in real time, to allow for the robot's egomotion, and not to rely on manual initialization.
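The abstract describes scoring hand-motion feature sequences with an HMM to detect pointing gestures, with head orientation as an additional feature. The following is a minimal sketch of that kind of classifier, not the authors' code: a Gaussian-emission HMM evaluated with the forward algorithm, where the feature choice (hand offset from the head, hand speed, head pan angle), the three-state left-to-right topology, and all parameter values are illustrative assumptions rather than details from the paper.

```python
# Hedged sketch: forward-algorithm log-likelihood of a feature sequence
# under a Gaussian HMM, used to decide "pointing" vs. "background".
import numpy as np


def log_gaussian(x, mean, var):
    """Log density of a diagonal-covariance Gaussian at x."""
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (x - mean) ** 2 / var)


def forward_log_likelihood(obs, log_pi, log_A, means, vars_):
    """Forward algorithm: log p(obs | HMM) with Gaussian emissions.

    obs    : (T, D) feature sequence
    log_pi : (N,)   log initial state probabilities
    log_A  : (N, N) log transition matrix
    means  : (N, D) per-state emission means
    vars_  : (N, D) per-state emission variances
    """
    T, N = obs.shape[0], log_pi.shape[0]
    log_b = np.array([[log_gaussian(obs[t], means[i], vars_[i])
                       for i in range(N)] for t in range(T)])  # (T, N)
    alpha = log_pi + log_b[0]
    for t in range(1, T):
        # log-sum-exp over predecessor states, then add the emission term
        alpha = log_b[t] + np.logaddexp.reduce(alpha[:, None] + log_A, axis=0)
    return np.logaddexp.reduce(alpha)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D = 5  # e.g. hand offset from head (x, y, z), hand speed, head pan angle

    # Assumed three-state left-to-right "pointing" model (begin / hold / retract)
    # with made-up parameters standing in for trained ones.
    log_pi = np.log(np.array([1.0, 1e-6, 1e-6]))
    log_A = np.log(np.array([[0.8, 0.2, 1e-6],
                             [1e-6, 0.8, 0.2],
                             [1e-6, 1e-6, 1.0]]))
    means = np.array([[0.1, 0.0, 0.2, 0.5, 0.0],
                      [0.4, 0.1, 0.5, 0.1, 0.3],
                      [0.1, 0.0, 0.2, 0.5, 0.0]])
    vars_ = np.full((3, D), 0.05)

    # Single-state background model for everything that is not a pointing gesture.
    bg_mean, bg_var = np.zeros((1, D)), np.full((1, D), 0.5)

    seq = means[1] + 0.1 * rng.standard_normal((20, D))  # synthetic "hold" phase
    ll_point = forward_log_likelihood(seq, log_pi, log_A, means, vars_)
    ll_bg = forward_log_likelihood(seq, np.zeros(1), np.zeros((1, 1)), bg_mean, bg_var)
    print("pointing" if ll_point > ll_bg else "background")
```

In practice the competing models would be trained on labeled sequences, and the head-orientation feature enters simply as extra dimensions of the observation vector, which is one plausible way the reported improvement from head orientation could be realized.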


