International Joint Conference on Natural Language Processing

Discourse-Aware Semantic Self-Attention for Narrative Reading Comprehension



Abstract

In this work, we propose to use linguistic annotations as a basis for a Discourse-Aware Semantic Self-Attention encoder that we employ for reading comprehension on narrative texts. We extract relations between discourse units, events and their arguments, as well as coreferring mentions, using available annotation tools. Our empirical evaluation shows that the investigated structures improve overall performance (up to +3.4 Rouge-L), especially intra-sentential and cross-sentential discourse relations, sentence-internal semantic role relations, and long-distance coreference relations. We show that dedicating self-attention heads to intra-sentential relations and to relations connecting neighboring sentences is beneficial for finding answers to questions in longer contexts. Our findings encourage the use of discourse-semantic annotations to enhance the generalization capacity of self-attention models for reading comprehension.
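The abstract describes dedicating self-attention heads to particular annotation-derived relations. A common way to realize this kind of design is to mask each "semantic" head so that it may only attend across token pairs linked by one relation type (e.g. coreference or a discourse relation). The following is a minimal illustrative sketch of that idea, not the authors' implementation; the names `semantic_self_attention` and `relation_mask` are hypothetical.

```python
import torch
import torch.nn.functional as F

def semantic_self_attention(q, k, v, relation_mask=None):
    """Scaled dot-product attention; if relation_mask is given, a head
    may only attend to positions the linguistic annotation connects.

    q, k, v:        (batch, heads, seq_len, head_dim)
    relation_mask:  (batch, heads, seq_len, seq_len) bool, True = allowed
    """
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5
    if relation_mask is not None:
        # Always keep the diagonal so each token can attend to itself;
        # otherwise a fully masked row would make the softmax produce NaNs.
        eye = torch.eye(scores.size(-1), dtype=torch.bool, device=scores.device)
        scores = scores.masked_fill(~(relation_mask | eye), float("-inf"))
    return F.softmax(scores, dim=-1) @ v

# Toy usage: 1 example, 2 heads, 5 tokens. Head 0 is an ordinary
# unconstrained head; head 1 only sees pairs marked as coreferent
# in a made-up annotation (tokens 0 and 3 corefer).
B, H, T, D = 1, 2, 5, 8
q, k, v = (torch.randn(B, H, T, D) for _ in range(3))
mask = torch.zeros(B, H, T, T, dtype=torch.bool)
mask[:, 0] = True                              # head 0: attend everywhere
mask[0, 1, 0, 3] = mask[0, 1, 3, 0] = True     # head 1: coreference link only
out = semantic_self_attention(q, k, v, relation_mask=mask)
print(out.shape)  # torch.Size([1, 2, 5, 8])
```

Since the masked head is otherwise a standard scaled dot-product attention head, a constraint of this form plugs into an existing Transformer encoder without changing the rest of the architecture.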
