ExpNet: Landmark-Free, Deep, 3D Facial Expressions


Abstract

We describe a deep learning based method for estimating 3D facial expression coefficients. Unlike previous work, our process does not rely on facial landmark detection methods as a proxy step. Recent methods have shown that a CNN can be trained to regress accurate and discriminative 3D morphable model (3DMM) representations directly from image intensities. By forgoing landmark detection, these methods were able to estimate shapes for occluded faces appearing in unprecedented viewing conditions. We build on those methods by showing that facial expressions can also be estimated by a robust, deep, landmark-free approach. Our ExpNet CNN is applied directly to the intensities of a face image and regresses a 29D vector of 3D expression coefficients. We propose a unique method for collecting data to train our network, leveraging the robustness of deep networks to training label noise. We further offer a novel means of evaluating the accuracy of estimated expression coefficients: by measuring how well they capture facial emotions on the CK+ and EmotiW-17 emotion recognition benchmarks. We show that our ExpNet produces expression coefficients which better discriminate between facial emotions than those obtained using state-of-the-art facial landmark detectors. Moreover, this advantage grows as image scales drop, demonstrating that our ExpNet is more robust to scale changes than landmark detectors. Finally, our ExpNet is orders of magnitude faster than its alternatives.
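To make the landmark-free formulation concrete, the sketch below shows one way a CNN could map raw face-image intensities straight to a 29D vector of 3DMM expression coefficients, as the abstract describes. The ResNet-18 backbone, input size, and L2 regression loss are illustrative assumptions; the abstract does not specify the network architecture or training objective.

```python
# Minimal sketch of a landmark-free expression regressor in the spirit of
# ExpNet. Backbone choice (ResNet-18) and MSE loss are assumptions, not
# the authors' specification.
import torch
import torch.nn as nn
from torchvision.models import resnet18

class ExpNetSketch(nn.Module):
    def __init__(self, num_coeffs: int = 29):
        super().__init__()
        backbone = resnet18(weights=None)
        # Replace the 1000-way classification head with a regression head
        # that outputs the 29 expression coefficients directly.
        backbone.fc = nn.Linear(backbone.fc.in_features, num_coeffs)
        self.backbone = backbone

    def forward(self, face: torch.Tensor) -> torch.Tensor:
        # face: (N, 3, H, W) cropped face images; note there is no
        # landmark-detection proxy step anywhere in the pipeline.
        return self.backbone(face)

model = ExpNetSketch()
coeffs = model(torch.randn(2, 3, 224, 224))          # -> (2, 29) coefficients
loss = nn.MSELoss()(coeffs, torch.zeros(2, 29))      # regress against 3DMM labels
```

Because the network regresses coefficients from pixel intensities alone, it stays applicable to occluded or low-resolution faces where a landmark detector would fail, which is the robustness property the abstract emphasizes.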
