
Discourse-Aware Semantic Self-Attention for Narrative Reading Comprehension



Abstract

In this work, we propose to use linguistic annotations as a basis for a Discourse-Aware Semantic Self-Attention encoder that we employ for reading comprehension on narrative texts. We extract relations between discourse units, events and their arguments, as well as coreferring mentions, using available annotation tools. Our empirical evaluation shows that the investigated structures improve the overall performance (up to +3.4 Rouge-L), especially intra-sentential and cross-sentential discourse relations, sentence-internal semantic role relations, and long-distance coreference relations. We show that dedicating self-attention heads to intra-sentential relations and relations connecting neighboring sentences is beneficial for finding answers to questions in longer contexts. Our findings encourage the use of discourse-semantic annotations to enhance the generalization capacity of self-attention models for reading comprehension.
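The core mechanism the abstract describes, restricting individual self-attention heads to token pairs linked by an annotated relation (e.g. a coreference chain), can be sketched as masked scaled dot-product attention. The following is a minimal illustrative sketch, not the paper's implementation; the function name and the toy coreference mask are assumptions for demonstration.

```python
import numpy as np

def masked_self_attention(q, k, v, mask):
    """Single-head scaled dot-product attention restricted by a relation mask.

    mask[i, j] = 1 allows token i to attend to token j; masked-out scores
    are set to a large negative value before the softmax (a common masking
    scheme; the paper's exact formulation may differ).
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores = np.where(mask > 0, scores, -1e9)
    # Numerically stable softmax over the allowed positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Toy example: 4 tokens; one head dedicated to a hypothetical coreference
# relation linking token 0 and token 3 (plus self-links so every row of
# the mask has at least one allowed position).
rng = np.random.default_rng(0)
q = k = v = rng.standard_normal((4, 8))
coref_mask = np.eye(4)
coref_mask[0, 3] = coref_mask[3, 0] = 1  # hypothetical coreference edge

out = masked_self_attention(q, k, v, coref_mask)
print(out.shape)  # (4, 8)
```

In a multi-head encoder, each relation type (discourse, semantic role, coreference) would supply its own mask for its dedicated heads, while the remaining heads attend without restriction.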

