Assessing Performance Competence in Training Games

Abstract

In-process assessment of trainee learners in game-based simulators is a challenging activity. It typically requires human instructor time and cost, and does not scale to the one-tutor-per-learner vision of computer-based learning. Moreover, evaluation by a human instructor is often subjective, and comparisons between learners are not accurate. Therefore, in this paper we propose an automated, formula-driven quantitative evaluation method for assessing performance competence in serious training games. Our proposed method has been empirically validated in a game-based driving simulator using 7 subjects and 13 sessions, and accuracy of up to 90.25% has been achieved when compared to an existing qualitative method. We believe that by incorporating quantitative evaluation methods like these, future training games could be enriched with more meaningful feedback and adaptive game-play, so as to better monitor and support player motivation, engagement and learning performance.
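The abstract does not give the paper's actual formula, but a formula-driven competence score of the kind described can be sketched as a weighted aggregation of normalized in-game metrics. The metric names, weights, and values below are purely hypothetical illustrations, not taken from the paper:

```python
def competence_score(metrics, weights):
    """Weighted average of normalized metric values, each in [0, 1].

    Illustrative sketch only: assumes competence is a weighted sum of
    per-metric scores divided by the total weight, yielding a value in [0, 1].
    """
    assert set(metrics) == set(weights), "every metric needs a weight"
    total_weight = sum(weights.values())
    return sum(weights[k] * metrics[k] for k in metrics) / total_weight

# Hypothetical driving-simulator metrics for a single training session.
session = {"lane_keeping": 0.92, "speed_control": 0.85, "reaction_time": 0.78}
weights = {"lane_keeping": 2.0, "speed_control": 1.0, "reaction_time": 1.0}

score = competence_score(session, weights)  # (2*0.92 + 0.85 + 0.78) / 4
```

An automated score of this shape can then be compared against an instructor's qualitative rating per session to measure agreement, which is presumably how an accuracy figure such as the 90.25% reported above would be obtained.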
