
Lip-sync in human face animation based on video analysis and spline models



Abstract

Human facial animation is an interesting and difficult problem in computer graphics. In this paper, a novel B-spline (NURBS) muscle system is proposed to simulate 3D facial expression and talking animation. The system extracts lip-shape parameters from video of a real person's lip movement and uses them to drive the appropriate muscles to form different phonemes. The muscles are constructed from non-uniform rational B-spline curves laid out according to anatomical knowledge. By using different numbers of control points on the muscles, more detailed facial expressions and mouth shapes can be simulated. We demonstrate the flexibility of our model by simulating different emotions and by lip-syncing to a talking-head video using the automatically extracted lip parameters.
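The muscle model described above is built from NURBS curves whose control points are displaced to shape the face. As a rough illustration of that idea only (not the authors' implementation), the sketch below evaluates a single NURBS curve with the Cox-de Boor recursion and shows how moving one control point reshapes a hypothetical "upper lip" muscle; the coordinates, weights, degree, and knot vector are made-up values for demonstration.

```python
# Minimal sketch: one facial "muscle" as a NURBS curve, reshaped by moving a
# control point. All numeric values and the "upper lip" label are assumptions
# for illustration, not taken from the paper.
import numpy as np

def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion: value of the i-th degree-p B-spline basis at u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] > knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, u, knots)
    right = 0.0
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl, weights, knots, degree):
    """Evaluate the rational (NURBS) curve: weighted basis sum over weight sum."""
    u = min(u, knots[-1] - 1e-9)          # stay inside the half-open last knot span
    numer = np.zeros(ctrl.shape[1])
    denom = 0.0
    for i in range(len(ctrl)):
        n = bspline_basis(i, degree, u, knots)
        numer += n * weights[i] * ctrl[i]
        denom += n * weights[i]
    return numer / denom

# Hypothetical "upper lip" muscle: 5 control points, cubic, clamped knot vector.
degree = 3
knots = np.array([0, 0, 0, 0, 0.5, 1, 1, 1, 1], dtype=float)
ctrl = np.array([[-2.0, 0.0, 0.0],
                 [-1.0, 0.3, 0.1],
                 [ 0.0, 0.5, 0.2],
                 [ 1.0, 0.3, 0.1],
                 [ 2.0, 0.0, 0.0]])
weights = np.ones(len(ctrl))

# A lip parameter (e.g. mouth opening) would displace control points; here we
# simply raise the middle point to "open" the lip.
opened = ctrl.copy()
opened[2, 1] += 0.8

for u in np.linspace(0.0, 1.0, 5):
    rest = nurbs_point(u, ctrl, weights, knots, degree)
    moved = nurbs_point(u, opened, weights, knots, degree)
    print(f"u={u:.2f}  rest={rest.round(3)}  opened={moved.round(3)}")
```

In the system described by the abstract, such control-point displacements would presumably be driven by the lip-shape parameters tracked in the video, with one curve per anatomically placed muscle; the mapping used here is purely illustrative.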
