Conference of the European Chapter of the Association for Computational Linguistics

Exploring Different Dimensions of Attention for Uncertainty Detection

Abstract

Neural networks with attention have proven effective for many natural language processing tasks. In this paper, we develop attention mechanisms for uncertainty detection. In particular, we generalize commonly used attention mechanisms by introducing external attention and sequence-preserving attention. These novel architectures differ from standard approaches in that they use external resources to compute attention weights and preserve sequence information. We compare them to other configurations along different dimensions of attention. Our novel architectures set the new state of the art on a Wikipedia benchmark dataset and perform similarly to the state-of-the-art model on a biomedical benchmark, which uses a large set of linguistic features.
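
The abstract describes two generalizations of standard attention: computing attention weights from external resources, and preserving sequence order instead of collapsing the input into a weighted average. The NumPy sketch below is a rough illustration of those ideas, not the authors' exact formulation; the scoring function, the cue-embedding matrix E, and the use of k-max selection for sequence preservation are all illustrative assumptions.

```python
# Illustrative sketch of the three attention variants named in the abstract.
# All names, shapes, and scoring functions are assumptions for exposition.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def standard_attention(H, w):
    """Standard attention: scores come from the hidden states themselves.
    H: (T, d) sequence of hidden states; w: (d,) learned scoring vector.
    Returns one (d,) vector: the weighted average discards word order."""
    alpha = softmax(H @ w)            # (T,) attention weights
    return alpha @ H                  # (d,) order-free summary

def external_attention(H, E, w):
    """External attention: weights are computed from an external resource
    (here, hypothetical cue embeddings E, e.g. from an uncertainty lexicon)
    rather than from the hidden states alone."""
    alpha = softmax(E @ w)            # (T,) weights from external features
    return alpha @ H                  # attend over H with external weights

def sequence_preserving_attention(H, w, k=3):
    """Sequence-preserving attention: instead of averaging, keep the k
    highest-weighted positions in their original order (k-max selection),
    so a downstream layer still sees a short ordered sequence."""
    alpha = softmax(H @ w)
    topk = np.sort(np.argsort(alpha)[-k:])   # top-k indices, original order
    return alpha[topk, None] * H[topk]       # (k, d) ordered, weighted slice

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, d = 7, 4
    H = rng.normal(size=(T, d))       # e.g. BiLSTM or CNN outputs
    E = rng.normal(size=(T, d))       # e.g. similarity to cue-word embeddings
    w = rng.normal(size=d)
    print(standard_attention(H, w).shape)             # (4,)
    print(external_attention(H, E, w).shape)          # (4,)
    print(sequence_preserving_attention(H, w).shape)  # (3, 4)
```

In this sketch, the standard and external variants each return a single order-free vector, while the sequence-preserving variant returns a short ordered subsequence that a downstream layer can still process as a sequence, which is the distinction the abstract draws.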
