Image and Vision Computing

Real-time facial action unit intensity prediction with regularized metric learning



Abstract

The ability to automatically infer emotional states, engagement, depression or pain from nonverbal behavior has recently attracted great interest in both research and industry, and promises a wide range of applications in robotics, biometrics, marketing and medicine. The Facial Action Coding System (FACS) proposed by Ekman provides objective descriptions of facial movements, characterizing the activation of facial muscles. Accurate intensity prediction of Action Units (AUs) has a significant impact on the quality of higher-level inferences about human behavior (e.g. emotional states). Real-time AU intensity prediction, like many image-related machine learning tasks, is a high-dimensional problem. To solve it, we propose adapting the Metric Learning for Kernel Regression (MLKR) framework, focusing on the overfitting issues induced by high-dimensional spaces. MLKR estimates the optimal linear subspace for reducing the squared error of a Gaussian kernel regressor. We introduce Iterative Regularized Kernel Regression (IRKR), an iterative nonlinear feature selection method combined with a Lasso-regularized version of the original MLKR formulation, which improves on state-of-the-art results on several AU databases ranging from prototypical to natural, in-the-wild data. (C) 2016 Elsevier B.V. All rights reserved.
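To make the abstract's core idea concrete: MLKR learns a linear map A that minimizes the leave-one-out squared error of a Gaussian kernel regressor in the projected space, and a Lasso penalty on A (as in the paper's regularized variant) discourages overfitting in high dimensions. The sketch below is a minimal illustration under assumed settings (toy data, subspace dimension, penalty weight, and SciPy's general-purpose optimizer); it is not the authors' IRKR implementation.

```python
import numpy as np
from scipy.optimize import minimize

def mlkr_loss(a_flat, X, y, d_out, lam):
    """Leave-one-out squared error of a Gaussian kernel regressor in the
    subspace defined by A, plus an L1 (Lasso) penalty on A's entries."""
    n, d = X.shape
    A = a_flat.reshape(d_out, d)
    Z = X @ A.T                                            # project features
    D = np.square(Z[:, None, :] - Z[None, :, :]).sum(-1)   # pairwise sq. distances
    K = np.exp(-D)                                         # Gaussian kernel weights
    np.fill_diagonal(K, 0.0)                               # leave-one-out: drop self
    y_hat = (K @ y) / (K.sum(axis=1) + 1e-12)              # kernel-regression estimate
    return np.square(y_hat - y).sum() + lam * np.abs(A).sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 10))                    # toy stand-in for image features
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=80)  # toy AU-intensity target

d_out, lam = 2, 1e-3                             # assumed subspace dim and penalty
A0 = 0.1 * rng.normal(size=(d_out, X.shape[1]))
res = minimize(mlkr_loss, A0.ravel(), args=(X, y, d_out, lam), method="L-BFGS-B")
A = res.x.reshape(d_out, X.shape[1])             # learned projection
```

After optimization, rows of `A` weight the input features; with an L1 penalty many weights shrink toward zero, which is the mechanism the iterative feature-selection loop in IRKR exploits.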
