Journal: International Journal of Continuing Engineering Education and Life-long Learning

Integrating parallel analysis modules to evaluate the meaning of answers to reading comprehension questions


Abstract

Contextualised, meaning-based interaction in the foreign language is widely recognised as crucial for second language acquisition. Correspondingly, current exercises in foreign language teaching generally require students to manipulate both form and meaning. If intelligent language tutoring systems are to support such activities, they must therefore be able to evaluate whether the meaning of a learner response is appropriate for a given exercise. We discuss such a content-assessment approach, focusing on reading comprehension exercises. We pursue the idea that a range of simultaneously available representations at different levels of complexity and linguistic abstraction provides a good empirical basis for content assessment. We show how an annotation-based NLP architecture implementing this idea can be realised and that it performs successfully on a corpus of authentic learner answers to reading comprehension questions. To support comparison and sustainable development in content assessment, we also define a general exchange format for such exercise data.
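The abstract does not spell out which analysis modules are integrated. Purely as an illustration of the general idea, the sketch below assumes three hypothetical analysis levels (surface tokens, lemmas, and dependency triples) whose overlaps with an annotated target answer are combined into a single content score; the class and function names (Annotated, overlap, assess) and the weights are invented here and are not the authors' architecture.

    # Hypothetical sketch: combine verdicts from parallel analysis levels
    # into one content-assessment score (not the paper's actual system).
    from dataclasses import dataclass

    @dataclass
    class Annotated:
        tokens: set[str]                          # surface-token level
        lemmas: set[str]                          # lemma level (abstracts over inflection)
        dependencies: set[tuple[str, str, str]]   # (head, relation, dependent) triples

    def overlap(learner_units: set, target_units: set) -> float:
        """Proportion of target units also found in the learner answer."""
        return len(learner_units & target_units) / len(target_units) if target_units else 0.0

    def assess(learner: Annotated, target: Annotated,
               weights: tuple[float, float, float] = (0.3, 0.3, 0.4)) -> float:
        """Weighted combination of per-level similarities, yielding a score in [0, 1]."""
        scores = (
            overlap(learner.tokens, target.tokens),
            overlap(learner.lemmas, target.lemmas),
            overlap(learner.dependencies, target.dependencies),
        )
        return sum(w * s for w, s in zip(weights, scores))

In this toy setup, a learner answer that matches the target at the lemma and dependency levels but uses different surface forms would still receive a high score, which is the motivation for keeping several levels of abstraction available in parallel.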
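The paper also defines an exchange format for reading comprehension exercise data, but the abstract gives no details of it. A purely hypothetical minimal record for such data might bundle the reading text, the question, one or more target answers, and the learner answer; the field names below are illustrative only and do not reflect the format actually proposed.

    # Hypothetical minimal exercise record; field names are invented for illustration.
    exercise_record = {
        "reading_text": "...",            # the text the question refers to
        "question": "...",
        "target_answers": ["..."],        # one or more acceptable answers
        "learner_answer": "...",
        "meta": {"learner_id": "anon-001", "exercise_id": "rc-042"},
    }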