
Improving Question Generation With to the Point Context


Abstract

Question generation (QG) is the task of generating a question from a reference sentence and a specified answer within the sentence. A major challenge in QG is to identify answer-relevant context words to complete the declarative-to-interrogative sentence transformation. Existing sequence-to-sequence neural models achieve this goal with proximity-based answer position encoding, under the intuition that words neighboring the answer are likely to be answer-relevant. However, such intuition may not apply to all cases, especially for sentences with complex answer-relevant relations. Consequently, the performance of these models drops sharply as the relative distance increases between the answer fragment and other non-stop-word sentence tokens that also appear in the ground-truth question. To address this issue, we propose a method that jointly models the unstructured sentence and the structured answer-relevant relation (extracted from the sentence in advance) for question generation. Specifically, the structured answer-relevant relation serves as to-the-point context and thus naturally helps keep the generated question to the point, while the unstructured sentence provides the full information. Extensive experiments show that the to-the-point context helps our question generation model achieve significant improvements on several automatic evaluation metrics. Furthermore, our model is capable of generating diverse questions for a sentence that conveys multiple relations involving its answer fragment.
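The abstract contrasts proximity-based answer position encoding with the proposed to-the-point context. As a rough illustration of the former only (not taken from the paper; the function name and feature layout are assumptions), the following minimal Python sketch tags each token with a BIO answer label plus its distance to the answer span, the kind of positional signal whose usefulness degrades when answer-relevant words sit far from the answer. The proposed model would additionally encode a structured answer-relevant relation extracted in advance as a second input; that part is not sketched here.

    def answer_position_features(tokens, answer_start, answer_end):
        """Proximity-based answer position encoding (illustrative sketch).

        For each token, emit a BIO answer tag and its distance to the
        nearest answer token; the intuition is that tokens closer to the
        answer are more likely to be answer-relevant.
        """
        features = []
        for i, tok in enumerate(tokens):
            if answer_start <= i <= answer_end:
                tag, dist = ("B" if i == answer_start else "I"), 0
            else:
                tag = "O"
                dist = min(abs(i - answer_start), abs(i - answer_end))
            features.append((tok, tag, dist))
        return features

    if __name__ == "__main__":
        sent = "Oxygen is released during photosynthesis by plants".split()
        # Hypothetical example: the answer span "photosynthesis" is token index 4.
        for tok, tag, dist in answer_position_features(sent, 4, 4):
            print(f"{tok:16s} {tag}  dist={dist}")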
