
Interactive Motion Mapping for Real-time Character Control


Abstract

It is now possible to capture the 3D motion of the human body on consumer hardware and to puppet skeleton-based virtual characters in real time. However, many characters do not have humanoid skeletons. Characters such as spiders and caterpillars do not have boned skeletons at all, and these characters have very different shapes and motions. In general, character control under arbitrary shape and motion transformations is unsolved: how might these motions be mapped? We control characters with a method which avoids the rigging-skinning pipeline: source and target characters do not have skeletons or rigs. We use interactively defined sparse pose correspondences to learn a mapping between arbitrary 3D point source sequences and mesh target sequences. Then, we puppet the target character in real time. We demonstrate the versatility of our method through results on diverse virtual characters with different input motion controllers. Our method provides a fast, flexible, and intuitive interface for arbitrary motion mapping, offering new ways to control characters for real-time animation.
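The core idea of the abstract — learning a mapping from sparse, user-defined pose correspondences and then driving the target mesh in real time — can be illustrated with a minimal sketch. The snippet below is an assumption-laden stand-in, not the paper's actual method: it blends the target mesh poses at the sparse correspondences with inverse-distance weights computed in the source pose space. The function name `map_pose` and the flattened-vector representation are illustrative choices.

```python
import numpy as np

def map_pose(source_pose, source_keys, target_keys, eps=1e-8):
    """Interpolate a target mesh pose from sparse pose correspondences.

    source_pose : (d,)   current source features (e.g. flattened 3D points)
    source_keys : (k, d) source poses at the k user-defined correspondences
    target_keys : (k, v) flattened target mesh vertex positions at those poses

    Returns a (v,) target pose as an inverse-distance-weighted blend of the
    key target poses; at a key pose the output matches that key's target.
    """
    # Distance from the live source pose to each correspondence pose.
    dist = np.linalg.norm(source_keys - source_pose, axis=1)
    # Inverse-distance weights; eps keeps an exact match finite.
    w = 1.0 / (dist + eps)
    w /= w.sum()
    # Blend the stored target poses (runs per frame, so it is real-time cheap).
    return w @ target_keys
```

With two correspondences, a source pose at the first key reproduces the first target pose, and a source pose midway between the keys blends the two target poses equally.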
