Human Factors and Ergonomics Society Annual Meeting

Integrating Head Pose Tracking with Eye Tracking for Wider FOV Virtual Environment



Abstract

In virtual-environment-based driving experiments, eye tracking is very helpful for quantitatively investigating driver visual behavior. However, commercial eye trackers usually have a limited maximum tracking range, e.g. ±35 degrees in the horizontal direction for the Tobii x50. In high-fidelity applications with a 135-degree field of view, or even a 360-degree curved screen, there is a need to extend the range of eye tracking. This paper proposes a method that combines head pose tracking and eye tracking to achieve a large sight-line tracking range in wide field-of-view applications. Head poses (yaw and pitch) were estimated from head images with Multilayer Perceptrons (MLPs). Head images were represented by a combination of coefficients obtained from Principal Component Analysis (PCA). With a 7×6 calibration grid, head pose estimation achieved mean errors of -2.1° in yaw and -6.5° in pitch. The standard deviation was about 10°. The estimation process is feasible for real-time, wide field-of-view virtual environment applications.
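
The paper does not include an implementation, but the PCA-plus-MLP pose regression it describes can be sketched roughly as follows. This minimal example uses scikit-learn as a stand-in for the authors' pipeline; the image format, number of principal components, and hidden-layer size are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch of PCA-coefficient head-pose regression with an MLP.
# Assumptions: flattened grayscale head images, 30 principal components,
# one hidden layer of 32 units -- none of these values come from the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

def train_head_pose_model(head_images, poses, n_components=30):
    """head_images: (n_samples, h*w) flattened head images.
    poses: (n_samples, 2) ground-truth [yaw, pitch] in degrees, e.g. collected
    while the subject looks at a 7x6 grid of calibration targets."""
    pca = PCA(n_components=n_components)
    features = pca.fit_transform(head_images)   # project images onto principal components
    mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    mlp.fit(features, poses)                    # regress yaw and pitch jointly
    return pca, mlp

def estimate_head_pose(pca, mlp, head_image):
    """Return (yaw, pitch) in degrees for a single flattened head image."""
    coeffs = pca.transform(head_image.reshape(1, -1))
    yaw, pitch = mlp.predict(coeffs)[0]
    return yaw, pitch
```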
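Combining the estimated head pose with the eye tracker's eye-in-head gaze is what extends the usable sight-line range beyond the tracker's own limit. The sketch below treats the combined direction as a simple sum of head and eye angles; this additive model, and the function and variable names, are assumptions for illustration and ignore eye-head coordination effects and sensor latency.

```python
# Sketch of extending sight-line tracking beyond the eye tracker's range
# (e.g. roughly +/-35 degrees horizontally for the Tobii x50, per the abstract)
# by adding head yaw/pitch to eye-in-head yaw/pitch. Additive combination is
# an illustrative assumption, not the paper's stated model.
EYE_TRACKER_LIMIT_DEG = 35.0  # horizontal limit quoted in the abstract

def eye_sample_reliable(eye_yaw_deg):
    """Flag eye samples that fall inside the tracker's reliable range."""
    return abs(eye_yaw_deg) <= EYE_TRACKER_LIMIT_DEG

def combined_sightline(head_yaw_deg, head_pitch_deg, eye_yaw_deg, eye_pitch_deg):
    """All angles in degrees: head pose in the screen/world frame,
    eye gaze in the head frame. Returns the sight line in the world frame."""
    return head_yaw_deg + eye_yaw_deg, head_pitch_deg + eye_pitch_deg
```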

