International Joint Conference on Natural Language Processing

A Gated Self-attention Memory Network for Answer Selection



Abstract

Answer selection is an important research problem, with applications in many areas. Previous deep learning based approaches for the task mainly adopt the Compare-Aggregate architecture that performs word-level comparison followed by aggregation. In this work, we take a departure from the popular Compare-Aggregate architecture, and instead, propose a new gated self-attention memory network for the task. Combined with a simple transfer learning technique from a large-scale online corpus, our model outperforms previous methods by a large margin, achieving new state-of-the-art results on two standard answer selection datasets: TrecQA and WikiQA.
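The abstract only names the gated self-attention mechanism without giving its equations. As a minimal sketch of what such a layer typically computes, the following assumes a standard scaled dot-product self-attention step whose output is blended with the input through a learned sigmoid gate; the specific gating form (`g * X + (1 - g) * A`) and all weight names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_self_attention(X, Wq, Wk, Wv, Wg, bg):
    """One gated self-attention step over a sequence X of shape (n, d).

    Each position attends to every position of the same sequence; a
    sigmoid gate then decides, per dimension, how much of the attended
    summary replaces the original representation. Weight shapes:
    Wq, Wk, Wv: (d, d); Wg: (2d, d); bg: (d,).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d)) @ V            # self-attention summary
    gate_in = np.concatenate([X, A], axis=-1) @ Wg + bg
    g = 1.0 / (1.0 + np.exp(-gate_in))               # sigmoid gate in (0, 1)
    return g * X + (1.0 - g) * A                     # gated blend of input and summary
```

The gate lets the network fall back to the raw input when the attended context is uninformative, which is the usual motivation for gating an attention or memory read.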

