Venue: International Conference on Human-Computer Interaction; International Conference on Augmented Cognition

Behaviour Adaptation Using Interaction Patterns with Augmented Reality Elements



Abstract

This publication describes a systematic approach to the behaviour adaptation of humans, based on interaction patterns as a fundamental way to design and describe human-machine interaction, and on image schemas as the basic elements of the resulting interaction. The natural learning path from childhood onward involves acquiring knowledge through experience; it is during this process that image schemas are built. The approach described in this paper was developed in close interplay with the concepts of cooperative guidance and control (CGC), in which a cooperative automation and a human control a machine together, and of augmented reality (AR), in which a natural representation of the world, e.g. in the form of a video stream, is enriched with dynamic symbology. The concept was instantiated as the interaction patterns "longitudinal and lateral collision avoidance", implemented in a fixed-base simulator, and tested with professional operators to determine whether driving performance and safety in a vehicle with restricted vision could be improved. Furthermore, it was tested whether interaction patterns could be used to adapt the current driver behaviour towards better performance while reducing task load. Interaction patterns that escalated according to the driver's actions and the current environmental state led to a reduction in temporal demand, effort, and frustration. Furthermore, fewer collisions were counted and the overall lateral displacement of the vehicle was reduced. The results were a good mix of encouragement and lessons learned, both for the methodical approach of pattern-based human-machine interaction and for the application of AR-based cooperative guidance and control.
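The abstract describes interaction patterns that escalate according to the driver's actions and the current environmental state. The following is a minimal sketch of what such escalation logic might look like; the levels, the use of time-to-collision as the environmental state, and all thresholds are illustrative assumptions, not details taken from the paper.

```python
from enum import Enum

class EscalationLevel(Enum):
    NONE = 0             # no support needed
    VISUAL_CUE = 1       # subtle AR overlay in the video stream
    SALIENT_WARNING = 2  # prominent AR symbology
    ACTIVE_SUPPORT = 3   # cooperative automation intervenes

def escalate(time_to_collision_s: float, driver_reacting: bool) -> EscalationLevel:
    """Choose an escalation level from the environmental state (here:
    time to collision) and the driver's current action (here: whether
    the driver is already reacting). Thresholds are hypothetical."""
    if time_to_collision_s > 4.0:
        return EscalationLevel.NONE
    if driver_reacting:
        # A driver who is already reacting needs at most a subtle cue.
        return EscalationLevel.VISUAL_CUE
    if time_to_collision_s > 2.0:
        return EscalationLevel.SALIENT_WARNING
    return EscalationLevel.ACTIVE_SUPPORT
```

The design intent, as the abstract suggests, is that support stays unobtrusive while the driver handles the situation and only becomes salient or active as the situation degrades, which is consistent with the reported reductions in temporal demand, effort, and frustration.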
