IEEE Conference on Computer Vision and Pattern Recognition

Learning Reconstruction-Based Remote Gaze Estimation

Abstract

Accurately estimating gaze from low-resolution eye images, which do not provide fine and detailed eye features, is a challenging problem. Existing methods attempt to establish a mapping from the visual appearance space to the gaze space. Different from the direct regression approach, the reconstruction-based approach represents appearance and gaze via local linear reconstruction in their own spaces. A common treatment is to use the same local reconstruction in the two spaces, i.e., the reconstruction weights computed in the appearance space are transferred to the gaze space for gaze reconstruction. However, this questionable treatment is taken for granted and has never been justified, leading to significant errors in gaze estimation. This paper is focused on the study of this fundamental issue. It shows that the distance metric in the appearance space needs to be adjusted before the same reconstruction can be used. A novel method is proposed to learn the metric, such that the affinity structure of the appearance space under this new metric is as close as possible to the affinity structure of the gaze space under the standard Euclidean metric. Furthermore, local affinity structure invariance is utilized to further regularize the solution for the reconstruction weights, so as to obtain a more robust and accurate solution. The effectiveness of the proposed method is validated and demonstrated through extensive experiments on different subjects.
