2nd Workshop on Representation Learning for NLP, 2017

Machine Comprehension by Text-to-Text Neural Question Generation

Abstract

We propose a recurrent neural model that generates natural-language questions from documents, conditioned on answers. We show how to train the model using a combination of supervised and reinforcement learning. After teacher forcing for standard maximum likelihood training, we fine-tune the model using policy gradient techniques to maximize several rewards that measure question quality. Most notably, one of these rewards is the performance of a question-answering system. We motivate question generation as a means to improve the performance of question answering systems. Our model is trained and evaluated on the recent question-answering dataset SQuAD.
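
The training recipe described above (teacher-forced maximum-likelihood pretraining, then policy-gradient fine-tuning against a question-quality reward) can be illustrated with a minimal sketch. The code below is an assumption-laden toy in PyTorch, not the authors' implementation: `QuestionGenerator` is an unconditioned toy decoder rather than the paper's document- and answer-conditioned encoder-decoder, and `toy_reward` merely stands in for rewards such as a question-answering system's score; all sizes and names are illustrative.

```python
# Minimal sketch (assumptions, not the authors' code) of the two-stage recipe:
# 1) maximum-likelihood training with teacher forcing,
# 2) REINFORCE-style policy-gradient fine-tuning against a scalar reward.
import torch
import torch.nn as nn

VOCAB, HIDDEN, MAXLEN = 1000, 64, 12  # toy sizes, chosen arbitrarily


class QuestionGenerator(nn.Module):
    """Toy recurrent decoder standing in for the conditioned question generator."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.rnn = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, tokens, hidden=None):
        output, hidden = self.rnn(self.embed(tokens), hidden)
        return self.out(output), hidden


model = QuestionGenerator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)


def mle_step(inp, target):
    """Teacher forcing: feed the gold prefix, maximize likelihood of the next token."""
    logits, _ = model(inp)
    loss = nn.functional.cross_entropy(logits.reshape(-1, VOCAB), target.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()


def toy_reward(question_ids):
    """Placeholder reward (type/token ratio); the paper uses e.g. QA performance."""
    return question_ids.unique().numel() / question_ids.numel()


def policy_gradient_step(bos_id=1):
    """Sample a question from the current policy and reinforce it by its reward."""
    tokens = torch.full((1, 1), bos_id, dtype=torch.long)
    log_probs, hidden = [], None
    for _ in range(MAXLEN):
        logits, hidden = model(tokens[:, -1:], hidden)
        dist = torch.distributions.Categorical(logits=logits[:, -1])
        tok = dist.sample()
        log_probs.append(dist.log_prob(tok))
        tokens = torch.cat([tokens, tok.unsqueeze(1)], dim=1)
    reward = toy_reward(tokens[0, 1:])
    loss = -reward * torch.stack(log_probs).sum()  # REINFORCE objective
    opt.zero_grad()
    loss.backward()
    opt.step()
    return reward


# Toy usage: one MLE step on random data, then one policy-gradient step.
inp = torch.randint(0, VOCAB, (2, MAXLEN))
tgt = torch.randint(0, VOCAB, (2, MAXLEN))
print("MLE loss:", mle_step(inp, tgt))
print("sampled-question reward:", policy_gradient_step())
```

In the paper's setting the reward would come from running a trained QA model on the generated question, and fine-tuning would typically mix the policy-gradient objective with the likelihood objective to keep the questions fluent; the sketch omits both for brevity.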
