Behavior Research Methods

Mode-of-disparities error correction of eye-tracking data


Abstract

In eye-tracking research, there is almost always a disparity between a person’s actual gaze location and the location recorded by the eye tracker. Disparities that are constant over time are systematic error. In this article, we propose an error correction method that can reliably reduce systematic error and restore fixations to their true locations. We show that the method is reliable when the visual objects in the experiment are arranged in an irregular manner—for example, when they are not on a grid in which all fixations can be shifted to adjacent locations using the same directional adjustment. The method first calculates the disparities between fixations and their nearest objects. It then uses the annealed mean shift algorithm to find the mode of the disparities. The mode is demonstrated to correctly capture the magnitude and direction of the systematic error so that it can be removed. This article presents the method, an extended demonstration, and a validation of the method’s efficacy.
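
The abstract describes a concrete pipeline: measure the offset from each fixation to its nearest visual object, find the mode of those offsets with an annealed mean shift, and subtract that mode from every fixation. The sketch below is not the authors' implementation; it is a minimal Python illustration of that pipeline, assuming fixations and object centers are supplied as N x 2 and M x 2 coordinate arrays (e.g., in pixels). The function names, bandwidth schedule, and iteration counts are illustrative assumptions, not values from the article.

import numpy as np

def disparities(fixations, objects):
    # For each fixation (N x 2 array), return its offset to the nearest object (M x 2 array).
    d = fixations[:, None, :] - objects[None, :, :]          # N x M x 2 pairwise offsets
    nearest = np.argmin(np.linalg.norm(d, axis=2), axis=1)   # index of nearest object per fixation
    return d[np.arange(len(fixations)), nearest]             # N x 2 fixation-to-nearest-object disparities

def annealed_mean_shift(points, bandwidths=(200.0, 100.0, 50.0, 25.0), iters=20):
    # Estimate the mode of a 2-D point cloud by mean shift with a Gaussian kernel
    # whose bandwidth shrinks in stages (a coarse-to-fine "annealing" schedule).
    mode = points.mean(axis=0)                               # start at the centroid
    for h in bandwidths:                                     # anneal: wide kernel -> narrow kernel
        for _ in range(iters):
            w = np.exp(-np.sum((points - mode) ** 2, axis=1) / (2.0 * h ** 2))
            new_mode = (w[:, None] * points).sum(axis=0) / w.sum()
            if np.linalg.norm(new_mode - mode) < 1e-6:       # converged at this bandwidth
                mode = new_mode
                break
            mode = new_mode
    return mode

def correct_systematic_error(fixations, objects):
    # Remove systematic error by shifting every fixation by the mode of the disparities.
    offset = annealed_mean_shift(disparities(fixations, objects))
    return fixations - offset

# Hypothetical usage: simulate a constant offset of (12, -8) pixels and recover it.
# objects = np.random.uniform(0, 1000, size=(15, 2))
# fixations = objects[np.random.randint(15, size=200)] + np.random.normal(0, 5, (200, 2)) + [12, -8]
# corrected = correct_systematic_error(fixations, objects)

Annealing from a wide to a narrow bandwidth lets the early iterations ignore small off-target clusters while the final estimate settles on the densest concentration of disparities, which is the property the abstract attributes to the mode-finding step.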