Conference of the European Chapter of the Association for Computational Linguistics

Exploring Different Dimensions of Attention for Uncertainty Detection

Abstract

Neural networks with attention have proven effective for many natural language processing tasks. In this paper, we develop attention mechanisms for uncertainty detection. In particular, we generalize commonly used attention mechanisms by introducing external attention and sequence-preserving attention. These novel architectures differ from standard approaches in that they use external resources to compute attention weights and preserve sequence information. We compare them to other configurations along different dimensions of attention. Our novel architectures set the new state of the art on a Wikipedia benchmark dataset and perform similarly to the state-of-the-art model, which uses a large set of linguistic features, on a biomedical benchmark.