A novel method for measuring gaze orientation in space in unrestrained head conditions

Abstract

Investigation of eye movement strategies often requires measuring gaze orientation without restraining the head. However, most commercial eye-trackers have low tolerance for head movements. Here we present a novel geometry-based method to estimate gaze orientation in space under unrestrained head conditions. The method combines eye-in-head orientation, provided by a head-mounted video-based eye-tracker, with head-in-space position and orientation, provided by a motion capture system. The method does not rely on specific assumptions about the configuration of the eye-tracker camera with respect to the eye, and it uses a central projection to estimate the pupil position from the camera image, thus improving upon previously proposed geometry-based procedures. The geometrical parameters for the mapping between pupil image and gaze orientation are derived with a calibration procedure based on nonlinear constrained optimization. Additionally, the method includes a procedure to correct for possible slippage of the tracker helmet, based on a geometrical representation of the pupil-to-gaze mapping. We tested and validated the method on seven subjects in the context of a one-handed catching experiment, obtaining accuracy better than 0.8° and precision better than 0.5° in the measurement of gaze orientation. Our method can be used with any video-based eye-tracking system to investigate eye movement strategies in a broad range of naturalistic experimental scenarios.
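The core geometric step, composing the eye-in-head gaze direction with the head-in-space pose, can be illustrated with a minimal sketch. This is not the paper's implementation: the function and variable names (gaze_in_space, R_head, p_head) are hypothetical, and the sketch neglects the offset of the eye from the tracked head point.

```python
import numpy as np

def gaze_in_space(gaze_dir_head, R_head, p_head):
    """Rotate an eye-in-head gaze direction into space coordinates.

    gaze_dir_head : (3,) gaze direction in the head frame
                    (from the head-mounted eye-tracker).
    R_head        : (3, 3) head orientation in space (from motion capture).
    p_head        : (3,) head position in space; used here as the gaze-ray
                    origin (eye offset from the head marker is neglected).
    """
    d = np.asarray(gaze_dir_head, dtype=float)
    d = d / np.linalg.norm(d)            # normalize to a unit direction
    direction = R_head @ d               # directions transform by rotation only
    origin = np.asarray(p_head, dtype=float)
    return origin, direction

# Example: head yawed 30 degrees, eye looking straight ahead in the head frame.
theta = np.deg2rad(30.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
origin, direction = gaze_in_space([1.0, 0.0, 0.0], R, [0.0, 0.0, 1.6])
print(origin, direction)                 # gaze ray expressed in space coordinates
```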
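The calibration step could, for instance, be sketched as fitting the parameters of a central-projection-style pupil-to-angle mapping with a bounded nonlinear optimizer. The parameterization below (fx, fy, cx, cy) and the helper names are illustrative assumptions; the abstract does not specify the paper's actual mapping or constraints.

```python
import numpy as np
from scipy.optimize import minimize

def predict_angles(params, uv):
    # Central-projection-style mapping from pupil image coordinates (u, v)
    # to gaze azimuth/elevation. Parameter names are illustrative only.
    fx, fy, cx, cy = params
    az = np.arctan2(uv[:, 0] - cx, fx)   # azimuth from horizontal pupil offset
    el = np.arctan2(uv[:, 1] - cy, fy)   # elevation from vertical pupil offset
    return np.column_stack([az, el])

def calibrate(uv, target_angles, x0=(500.0, 500.0, 320.0, 240.0)):
    def cost(params):
        err = predict_angles(params, uv) - target_angles
        return np.sum(err ** 2)          # sum of squared angular errors
    # Bounds keep the parameters in a plausible range; they stand in for the
    # constraints of the paper's nonlinear constrained optimization.
    bounds = [(100.0, 5000.0), (100.0, 5000.0), (0.0, 640.0), (0.0, 480.0)]
    res = minimize(cost, x0, bounds=bounds, method="L-BFGS-B")
    return res.x

# Usage with synthetic calibration data (true parameters shown for checking):
true = np.array([900.0, 900.0, 320.0, 240.0])
uv = np.array([[120.0, 80.0], [320.0, 240.0], [520.0, 400.0], [220.0, 300.0]])
targets = predict_angles(true, uv)
print(calibrate(uv, targets))            # should recover values close to `true`
```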
