
Using visual and auditory feedback for instrument-playing humanoids



Abstract

In this paper, we present techniques that enable a humanoid to autonomously play instruments such as the metallophone. The core of our approach is a model-based method that estimates the pose of the instrument and of the beaters held by the robot from observations of the onboard camera. For accurate playing, we calibrate the kinematic parameters of the robot and find valid arm configurations for striking the individual sound bars of the instrument. To determine these configurations, we rely on the estimated poses of the instrument and the beaters and apply inverse kinematics (IK). To accelerate the IK computation and to compensate for local minima, we use precomputed forward kinematics solutions represented as a reachability tree. The robot automatically validates the computed IK configurations based on visual and auditory feedback from its sensors and adapts its arm configurations if necessary. Our system parses MIDI files of whole songs, maps the notes to the corresponding arm configurations for striking, and generates trajectories in joint space to hit the sound bars. As we show in the experiments with a Nao humanoid presented in this paper, as well as in the accompanying video, our approach allows for clean and in-time playing of complete songs on a metallophone.
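The note-to-configuration mapping and joint-space trajectory generation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the joint-angle values, the dictionary `note_to_config`, the rest pose, and the linear-interpolation step count are all assumptions made for the example.

```python
# Hypothetical sketch: map MIDI note numbers to precomputed, validated arm
# configurations (one per sound bar) and generate a joint-space trajectory
# that strikes the bar and returns to a rest pose.

# Placeholder joint angles (radians) for a 4-DOF arm; in the real system these
# would come from the IK step seeded by the reachability tree and validated
# via visual and auditory feedback.
note_to_config = {
    60: [0.10, -0.30, 0.50, 1.20],   # C4 bar
    62: [0.15, -0.28, 0.48, 1.15],   # D4 bar
    64: [0.20, -0.26, 0.46, 1.10],   # E4 bar
}

REST_CONFIG = [0.0, 0.0, 0.0, 0.0]   # arm pose held between strikes

def interpolate(q_from, q_to, steps):
    """Linearly interpolate a joint-space trajectory between two configurations."""
    traj = []
    for i in range(1, steps + 1):
        t = i / steps
        traj.append([a + t * (b - a) for a, b in zip(q_from, q_to)])
    return traj

def trajectory_for_note(note, steps=10):
    """Down-stroke from the rest pose to the bar's configuration, then back."""
    q_hit = note_to_config[note]
    return interpolate(REST_CONFIG, q_hit, steps) + interpolate(q_hit, REST_CONFIG, steps)

traj = trajectory_for_note(60)
```

Parsing a MIDI file would then reduce to iterating over its note-on events, looking each note up in `note_to_config`, and concatenating the resulting strike trajectories with timing taken from the note onsets.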
