IEEE/RSJ International Conference on Intelligent Robots and Systems

Auditory robotic tracking of sound sources using hybrid cross-correlation and recurrent networks

Abstract

This paper describes an auditory robotic system capable of computing the angle of incidence (azimuth) of a sound source on the horizontal plane. Using an Elman-type recurrent neural network (RNN), the system dynamically tracks the sound source as its azimuth changes within the environment. The RNN allows the overall system to produce fast tracking responses over a set time, rather than waiting for the next sound position before moving. The system is first tested in a simulated environment, and these results are then compared with tests on the physical robot. The results show that a hybrid system combining cross-correlation and recurrent neural networks is an effective mechanism for controlling a robot that tracks sound sources azimuthally.
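
The abstract names two components: cross-correlation of a microphone pair to estimate azimuth, and an Elman-type recurrent network that keeps the tracking response going between sound events. The Python sketch below illustrates both ideas. It is not the authors' implementation; the microphone spacing, sample rate, sign convention, network sizes, and the names azimuth_from_cross_correlation and ElmanCell are illustrative assumptions, and the recurrent cell is shown untrained.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s (assumed room conditions)
MIC_SPACING = 0.2        # m between the two microphones (illustrative)
SAMPLE_RATE = 44100      # Hz (illustrative)

def azimuth_from_cross_correlation(left, right):
    """Estimate source azimuth (radians) from the inter-microphone time delay.

    Positive azimuth is taken to be toward the right microphone; this sign
    convention is an assumption of the sketch, not taken from the paper.
    """
    # The peak of the full cross-correlation gives the delay of the left
    # channel relative to the right channel, in samples.
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)
    tdoa = lag / SAMPLE_RATE
    # Invert the far-field geometry, clipping to the physically valid range.
    sin_theta = np.clip(SPEED_OF_SOUND * tdoa / MIC_SPACING, -1.0, 1.0)
    return float(np.arcsin(sin_theta))

class ElmanCell:
    """Tiny Elman-style recurrent cell: the hidden state feeds back as context."""

    def __init__(self, n_in=1, n_hidden=8, n_out=1, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.w_rec = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
        self.w_out = rng.normal(scale=0.1, size=(n_out, n_hidden))
        self.h = np.zeros(n_hidden)

    def step(self, x):
        # Context layer = previous hidden state (the defining Elman feature).
        self.h = np.tanh(self.w_in @ x + self.w_rec @ self.h)
        return self.w_out @ self.h

if __name__ == "__main__":
    # Simulate a broadband source at +30 degrees: the left channel is a
    # delayed copy of the right channel by the corresponding time difference.
    rng = np.random.default_rng(1)
    src = rng.normal(size=4096)
    true_az = np.deg2rad(30.0)
    delay = int(round(MIC_SPACING * np.sin(true_az) / SPEED_OF_SOUND * SAMPLE_RATE))
    n = 2048
    left, right = src[:n], src[delay:delay + n]
    est = azimuth_from_cross_correlation(left, right)
    print(f"estimated azimuth: {np.degrees(est):.1f} deg (true 30.0 deg)")

    # Feed the azimuth estimate to the recurrent cell; in the paper's setting,
    # a trained RNN of this kind supplies the tracking response between sound
    # events instead of waiting for the next measurement. This cell is
    # untrained, so its output here is only a placeholder for that role.
    cell = ElmanCell()
    predicted = cell.step(np.array([est]))
```

A broadband, noise-like test signal is used in the sketch because it yields a single sharp correlation peak; with a narrowband tone the cross-correlation has near-equal peaks one period apart and the delay estimate becomes ambiguous.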

