AAAI Conference on Artificial Intelligence

Generating Distractors for Reading Comprehension Questions from Real Examinations



Abstract

We investigate the task of distractor generation for multiple-choice reading comprehension questions from examinations. In contrast to all previous work, we do not aim to generate word or short-phrase distractors; instead, we endeavor to generate longer, semantically rich distractors that are closer to the distractors found in real examination reading comprehension questions. Taking a reading comprehension article, a question, and its correct option as input, our goal is to generate several distractors that are somehow related to the answer, consistent with the semantic context of the question, and traceable to the article. We propose a hierarchical encoder-decoder framework with static and dynamic attention mechanisms to tackle this task. Specifically, the dynamic attention combines sentence-level and word-level attention, varying at each recurrent time step, to generate a more readable sequence. The static attention modulates the dynamic attention so that it does not focus on sentences that are irrelevant to the question or that contribute to the correct option. Our proposed framework outperforms several strong baselines on the first distractor generation dataset prepared from real reading comprehension questions. In human evaluation, compared with distractors generated by the baselines, our generated distractors are more effective at confusing the annotators.
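The interaction the abstract describes — word-level attention within each sentence, scaled by a dynamic sentence-level attention that is in turn modulated by a static sentence distribution — can be sketched in plain Python. This is a minimal illustrative sketch, not the paper's exact formulation: the function names and the specific combination rule (multiplying the dynamic sentence distribution by the static weights, then renormalizing) are assumptions made here for clarity.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def combine_attention(word_scores, dyn_sent_scores, static_sent_weights):
    """Combine word- and sentence-level attention at one decoder step.

    word_scores: per-sentence lists of word-level logits at this step
    dyn_sent_scores: sentence-level logits at this step (dynamic attention)
    static_sent_weights: weights in [0, 1], computed once per (article,
        question, answer) to down-weight irrelevant sentences and sentences
        supporting the correct option (static attention)
    Returns one flat attention distribution over all words in the article.
    """
    # modulate the dynamic sentence attention by the static weights
    gamma = [g * w for g, w in zip(softmax(dyn_sent_scores), static_sent_weights)]
    z = sum(gamma)
    gamma = [g / z for g in gamma]
    # scale each sentence's word-level attention by its sentence weight
    alpha = []
    for g, scores in zip(gamma, word_scores):
        alpha.extend(g * b for b in softmax(scores))
    return alpha
```

A sentence whose static weight is zero contributes no attention mass, so the decoder cannot copy from it, while the remaining mass is renormalized into a valid distribution.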
