Engineering Applications of Artificial Intelligence

Gesture recognition system for real-time mobile robot control based on inertial sensors and motion strings

Abstract

Navigating and controlling a mobile robot in an indoor or outdoor environment by using a range of body-worn sensors is becoming an increasingly interesting research area in the robotics community. In such scenarios, hand gestures offer some unique capabilities for human-robot interaction inherent to nonverbal communication, with features and application scenarios not possible with the currently predominant vision-based systems. Therefore, in this paper, we propose and develop an effective inertial-sensor-based system, worn by the user, along with a microprocessor and wireless module for communication with the robot at distances of up to 250 m. Candidate features describing hand-gesture dynamics are introduced, and their feasibility is demonstrated in an off-line scenario using several classification methods (e.g., random forests and artificial neural networks). The refined motion features are then used in K-means unsupervised clustering to extract motion primitives, which form the motion strings used for real-time classification. The system achieved an F1 score of 90.05% while supporting gesture spotting and null-class classification (i.e., undefined gestures are discarded from the analysis). Finally, to demonstrate the feasibility of the proposed algorithm, it was implemented on an Arduino-based 8-bit ATmega2560 microcontroller to control a mobile tracked robot platform.
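
As a rough illustration of the motion-string idea summarized above, the following Python sketch quantizes windowed inertial features into motion primitives with K-means and concatenates the resulting cluster labels into a string. The feature layout, window contents, cluster count, and all variable names are assumptions made for illustration only, not the authors' implementation.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Each row stands in for a feature vector computed over a short window of
    # accelerometer/gyroscope samples (e.g., per-axis mean, variance, energy).
    train_features = rng.normal(size=(500, 6))    # placeholder training windows
    gesture_windows = rng.normal(size=(12, 6))    # placeholder windows of one gesture

    # Unsupervised clustering: each cluster centre acts as one motion primitive.
    kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(train_features)

    # A gesture becomes a string of primitive labels (a "motion string"),
    # which can then be matched against stored template strings in real time.
    labels = kmeans.predict(gesture_windows)
    motion_string = "".join(chr(ord("A") + int(lab)) for lab in labels)
    print(motion_string)

In a real-time setting, the windows would come from the wearable inertial sensors rather than random placeholders, and the resulting string would be compared against gesture templates (for example, by a string-distance measure), with unmatched strings assigned to the null class, consistent with the gesture spotting described in the abstract.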
