Journal: Pattern Recognition: The Journal of the Pattern Recognition Society

Position-aware self-attention based neural sequence labeling



Abstract

Sequence labeling is a fundamental task in natural language processing and has been widely studied. Recently, RNN-based sequence labeling models have increasingly gained attention. Although they achieve superior performance by learning long short-term (i.e., successive) dependencies, their sequential processing of inputs can limit the ability to capture non-continuous relations among tokens within a sentence. To tackle this problem, we focus on effectively modeling both the successive and the discrete dependencies of each token to enhance sequence labeling performance. Specifically, we propose an innovative attention-based model, called position-aware self-attention (PSA), together with a well-designed self-attentional context fusion layer within a neural network architecture, which exploits the positional information of an input sequence to capture latent relations among tokens. Extensive experiments on three classical sequence labeling tasks, i.e., part-of-speech (POS) tagging, named entity recognition (NER), and phrase chunking, demonstrate that our proposed model outperforms the state of the art on various metrics without using any external knowledge. (c) 2020 Elsevier Ltd. All rights reserved.
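The abstract does not give the paper's exact PSA formulation, but the general idea of self-attention augmented with positional information can be sketched as scaled dot-product attention plus an additive relative-position bias. The bias scheme, weight shapes, and function name below are illustrative assumptions, not the authors' layer:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def position_aware_self_attention(X, Wq, Wk, Wv, pos_bias):
    """Self-attention over a token sequence X (T x d), with an additive
    bias pos_bias[i, j] encoding the relative position of tokens i and j.
    A generic sketch, not the paper's exact PSA layer."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d) + pos_bias   # (T, T) attention logits
    attn = softmax(scores, axis=-1)            # each row sums to 1
    return attn @ V                            # (T, d) context vectors

# toy example: 4 tokens, model dimension 8
rng = np.random.default_rng(0)
T, d = 4, 8
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
# assumed bias scheme: favor nearby tokens, decaying with distance
idx = np.arange(T)
pos_bias = -np.abs(idx[:, None] - idx[None, :]).astype(float)
out = position_aware_self_attention(X, Wq, Wk, Wv, pos_bias)
print(out.shape)  # (4, 8)
```

Because every token attends to every other token in one step, such a layer can relate non-adjacent tokens directly, which is the limitation of sequential RNN processing that the abstract highlights.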

