International Conference on Computational Linguistics

Attending to Characters in Neural Sequence Labeling Models



Abstract

Sequence labeling architectures use word embeddings for capturing similarity, but suffer when handling previously unseen or rare words. We investigate character-level extensions to such models and propose a novel architecture for combining alternative word representations. By using an attention mechanism, the model is able to dynamically decide how much information to use from a word- or character-level component. We evaluated different architectures on a range of sequence labeling datasets, and character-level extensions were found to improve performance on every benchmark. In addition, the proposed attention-based architecture delivered the best results even with a smaller number of trainable parameters.
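The combination described in the abstract can be thought of as a learned gate that interpolates, per dimension, between a word-level embedding and a character-level representation. Below is a minimal NumPy sketch of such a gate; the dimensionality, initialization, and the stand-in vectors are illustrative assumptions, not the paper's trained model.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding size (illustrative)

# Hypothetical parameters; in a real model these are trained end to end
# together with the rest of the sequence labeler.
W1 = rng.normal(scale=0.1, size=(d, d))
W2 = rng.normal(scale=0.1, size=(d, d))
W3 = rng.normal(scale=0.1, size=(d, d))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def combine(word_vec, char_vec):
    """Gate that dynamically decides, per dimension, how much
    information to take from the word-level vs. the character-level
    component."""
    z = sigmoid(W3 @ np.tanh(W1 @ word_vec + W2 @ char_vec))
    return z * word_vec + (1.0 - z) * char_vec

word_vec = rng.normal(size=d)  # stand-in for a pretrained word embedding
char_vec = rng.normal(size=d)  # stand-in for a char-level encoder output
combined = combine(word_vec, char_vec)
print(combined.shape)  # (8,)
```

Because the gate value lies in (0, 1), each dimension of the combined vector is a convex mixture of the two inputs, so a rare or unseen word (with an uninformative word embedding) can lean on its character-level representation.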
