Multi-Granularity Hierarchical Attention Fusion Networks for Reading Comprehension and Question Answering

Abstract

This paper describes a novel hierarchical attention network for reading-comprehension-style question answering, which aims to answer questions about a given narrative paragraph. In the proposed method, attention and fusion are conducted horizontally and vertically across layers, at different levels of granularity, between the question and the paragraph. Specifically, the model first encodes the question and paragraph with fine-grained language embeddings to better capture their respective representations at the semantic level. It then applies a multi-granularity fusion approach to fully fuse information from both the global and the attended representations. Finally, it introduces a hierarchical attention network that focuses on the answer span progressively through multi-level soft alignment. Extensive experiments on the large-scale SQuAD and TriviaQA datasets validate the effectiveness of the proposed method. At the time of writing (Jan. 12th, 2018), our model holds the first position on the SQuAD leaderboard for both single and ensemble models. We also achieve state-of-the-art results on the TriviaQA, AddSent, and AddOne-Sent datasets.
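The abstract gives no implementation details, but the attend-then-fuse step it describes can be sketched generically. Below is a minimal Python/NumPy sketch of one such step between paragraph and question encodings, assuming simple dot-product attention and a commonly used fusion feature set [p; q_att; p * q_att; p - q_att]; the fusion function, its parameters (W_fuse, b_fuse), and all dimensions are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend_and_fuse(P, Q, W_fuse, b_fuse):
    """One attention-and-fusion step between paragraph and question.

    P      : (m, d) paragraph token encodings
    Q      : (n, d) question token encodings
    W_fuse : (4*d, d) fusion projection -- hypothetical parameter
    b_fuse : (d,)     fusion bias       -- hypothetical parameter
    Returns an (m, d) question-aware paragraph representation.
    """
    S = P @ Q.T                # (m, n) dot-product similarity scores
    A = softmax(S, axis=1)     # paragraph-to-question attention weights
    Q_att = A @ Q              # (m, d) attended question summary per token
    # Fuse the global and attended views via a common feature set,
    # then project back to d dimensions.
    F = np.concatenate([P, Q_att, P * Q_att, P - Q_att], axis=1)
    return np.tanh(F @ W_fuse + b_fuse)

# Toy usage with random encodings.
rng = np.random.default_rng(0)
m, n, d = 5, 3, 8
P = rng.normal(size=(m, d))
Q = rng.normal(size=(n, d))
W_fuse = 0.1 * rng.normal(size=(4 * d, d))
b_fuse = np.zeros(d)
print(attend_and_fuse(P, Q, W_fuse, b_fuse).shape)  # (5, 8)
```

In the paper's framing, a step like this would be applied repeatedly at multiple levels of granularity, with each level re-attending over the previous fused representation to progressively narrow in on the answer span.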
