2nd Workshop on Representation Learning for NLP (2017)

Sequential Attention: A Context-Aware Alignment Function for Machine Reading



Abstract

In this paper we propose a neural network model with a novel Sequential Attention layer that extends soft attention by assigning weights to words in an input sequence in a way that takes into account not just how well that word matches a query, but how well surrounding words match. We evaluate this approach on the task of reading comprehension (on the Who did What and CNN datasets) and show that it dramatically improves a strong baseline, the Stanford Reader, and is competitive with the state of the art.
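The core idea in the abstract, scoring each token not only by its own match with the query but also by how well its neighbors match, can be illustrated with a minimal sketch. The paper's actual layer feeds per-token match information through a learned bidirectional RNN; the fixed-weight forward and backward recurrences below are only a hypothetical stand-in (function names, the decay weight `w`, and the toy data are all assumptions, not the authors' implementation):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def soft_attention(H, q):
    """Standard soft attention: each token is weighted only by its own
    match score with the query."""
    scores = H @ q
    return softmax(scores)

def sequential_attention(H, q, w=0.5):
    """Hypothetical context-aware variant: smooth each token's match score
    with its neighbors' scores via a left-to-right and a right-to-left
    recurrence before the softmax, so surrounding matches also contribute."""
    scores = H @ q
    fwd = np.zeros_like(scores)
    bwd = np.zeros_like(scores)
    n = len(scores)
    for t in range(n):                 # left-to-right accumulation
        fwd[t] = scores[t] + (w * fwd[t - 1] if t > 0 else 0.0)
    for t in reversed(range(n)):       # right-to-left accumulation
        bwd[t] = scores[t] + (w * bwd[t + 1] if t < n - 1 else 0.0)
    # combine both directions; subtract scores once to avoid double-counting
    return softmax(fwd + bwd - scores)

rng = np.random.default_rng(0)
H = rng.normal(size=(6, 4))  # toy token representations (6 tokens, dim 4)
q = rng.normal(size=4)       # toy query vector
a = sequential_attention(H, q)
```

The resulting weights still form a distribution over tokens, but an isolated high-scoring token is no longer guaranteed the largest weight if its neighbors match the query poorly.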


