IEEE International Conference on Robotics & Automation

Real-time RGB-D based people detection and tracking for mobile robots and head-worn cameras

Abstract

We present a real-time RGB-D based multi-person detection and tracking system suitable for mobile robots and head-worn cameras. Our approach combines RGB-D visual odometry estimation, region-of-interest processing, ground plane estimation, pedestrian detection, and multi-hypothesis tracking components into a robust vision system that runs at more than 20 fps on a laptop. As object detection is the most expensive component in any such integration, we invest significant effort into taking maximum advantage of the available depth information. In particular, we propose to use two different detectors for different distance ranges. For the close range (up to 5–7 m), we present an extremely fast depth-based upper-body detector that allows video-rate system performance on a single CPU core when applied to Kinect sensors. In order to also cover farther distance ranges, we optionally add an appearance-based full-body HOG detector (running on the GPU) that exploits scene geometry to restrict the search space. Our approach can work with both Kinect RGB-D input for indoor settings and with stereo depth input for outdoor scenarios. We quantitatively evaluate our approach on challenging indoor and outdoor sequences and show state-of-the-art performance in a large variety of settings. Our code is publicly available.
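To illustrate the distance-gated detector combination described in the abstract, here is a minimal sketch, not the authors' implementation: it assumes hypothetical detector callables (`upper_body_detector`, `hog_detector`) that each return detections with an estimated metric distance, and simply keeps close-range hypotheses from the cheap depth-based detector and far-range hypotheses from the optional appearance-based one.

```python
# Minimal sketch of a distance-gated two-detector scheme.
# Detection, upper_body_detector, hog_detector and CLOSE_RANGE_MAX
# are illustrative assumptions, not part of the published system.

class Detection:
    def __init__(self, bbox, distance, score):
        self.bbox = bbox          # (x, y, w, h) in pixels
        self.distance = distance  # estimated metric distance from the camera (m)
        self.score = score        # detector confidence

CLOSE_RANGE_MAX = 7.0  # close-range limit (m) for the depth-based detector


def detect_people(rgb, depth, upper_body_detector, hog_detector=None):
    """Combine a fast close-range depth-based upper-body detector with an
    optional far-range appearance-based full-body detector."""
    detections = []

    # Close range: depth-based upper-body detector (fast, single CPU core).
    for det in upper_body_detector(depth):
        if det.distance <= CLOSE_RANGE_MAX:
            detections.append(det)

    # Far range: appearance-based full-body HOG detector (e.g. GPU-accelerated),
    # used only for hypotheses beyond the close-range limit.
    if hog_detector is not None:
        for det in hog_detector(rgb):
            if det.distance > CLOSE_RANGE_MAX:
                detections.append(det)

    return detections
```

The split keeps the expensive appearance-based detector optional: at close range the depth-based detector alone is enough for video-rate performance, while the HOG detector extends coverage to distances where depth becomes unreliable, with scene geometry (ground plane) restricting its search space.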
