Annual Meeting of the Association for Computational Linguistics

Self-Attention Architectures for Answer-Agnostic Neural Question Generation



Abstract

Neural architectures based on self-attention, such as Transformers, have recently attracted interest from the research community and obtained significant improvements over the state of the art on several tasks. We explore how Transformers can be adapted to the task of Neural Question Generation without constraining the model to focus on a specific answer passage. We study the effect of several strategies for dealing with out-of-vocabulary words: copy mechanisms, placeholders, and contextual word embeddings. We report improvements over the state of the art on the SQuAD dataset according to automated metrics (BLEU, ROUGE), as well as qualitative human assessments of the system outputs.
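As an illustration of one of the out-of-vocabulary strategies named above, below is a minimal sketch of a placeholder approach: rare source tokens are replaced by indexed placeholder symbols before encoding, and any placeholders emitted in the generated question are mapped back to the original surface forms afterwards. The toy vocabulary, function names, and example sentence here are hypothetical and not taken from the paper.

```python
# Minimal sketch of a placeholder strategy for out-of-vocabulary (OOV) words.
# VOCAB, replace_oov, and restore_placeholders are illustrative names only.

VOCAB = {"when", "was", "founded", "in", "the", "?"}  # toy model vocabulary

def replace_oov(tokens):
    """Replace OOV tokens with indexed placeholders (PH_0, PH_1, ...)."""
    mapping = {}   # original token -> placeholder symbol
    encoded = []
    for tok in tokens:
        if tok.lower() in VOCAB:
            encoded.append(tok)
        else:
            placeholder = mapping.setdefault(tok, f"PH_{len(mapping)}")
            encoded.append(placeholder)
    return encoded, mapping

def restore_placeholders(tokens, mapping):
    """Map placeholders in a generated question back to the source tokens."""
    inverse = {ph: tok for tok, ph in mapping.items()}
    return [inverse.get(tok, tok) for tok in tokens]

source = "Strasbourg was founded in 12 BC".split()
encoded, mapping = replace_oov(source)
print(encoded)  # ['PH_0', 'was', 'founded', 'in', 'PH_1', 'PH_2']

# Suppose the generation model outputs a question containing a placeholder:
generated = ["when", "was", "PH_0", "founded", "?"]
print(" ".join(restore_placeholders(generated, mapping)))
# when was Strasbourg founded ?
```

This is only one simple way such a scheme can work; the paper compares it against copy mechanisms and contextual word embeddings as alternative treatments of rare words.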
