
A Gated Self-attention Memory Network for Answer Selection



Abstract

Answer selection is an important research problem, with applications in many areas. Previous deep learning based approaches for the task mainly adopt the Compare-Aggregate architecture that performs word-level comparison followed by aggregation. In this work, we take a departure from the popular Compare-Aggregate architecture, and instead, propose a new gated self-attention memory network for the task. Combined with a simple transfer learning technique from a large-scale online corpus, our model outperforms previous methods by a large margin, achieving new state-of-the-art results on two standard answer selection datasets: TrecQA and WikiQA.
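The abstract's key idea is a gated self-attention layer: each position attends over the whole sequence, and a learned sigmoid gate blends the attended summary with the original representation. Below is a minimal NumPy sketch of that idea; the weight names (`Wq`, `Wk`, `Wv`, `Wg`) and the exact gating form are illustrative assumptions, not the paper's equations.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_self_attention(X, Wq, Wk, Wv, Wg):
    """One gated self-attention step over a sequence X of shape (n, d).

    Scaled dot-product self-attention mixes every position with every
    other; an elementwise sigmoid gate then interpolates between the
    attended summary and the raw input, so the layer can fall back to
    the original representation when attention is unhelpful.
    (Illustrative formulation, not the paper's exact one.)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d))      # (n, n) attention weights
    summary = attn @ V                        # attended representation
    gate = sigmoid(X @ Wg)                    # elementwise gate in (0, 1)
    return gate * summary + (1.0 - gate) * X  # gated residual blend

# Tiny usage example with random weights (shapes only, no training).
rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.normal(size=(n, d))
Wq, Wk, Wv, Wg = (rng.normal(size=(d, d)) * 0.1 for _ in range(4))
out = gated_self_attention(X, Wq, Wk, Wv, Wg)
print(out.shape)  # (5, 8)
```

Because the gate is applied elementwise, the network can retain some feature dimensions of the input while replacing others with attended context, which is the "gated" part of the architecture's name.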

