Conference: International Joint Conference on Natural Language Processing / Conference on Empirical Methods in Natural Language Processing

Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations



Abstract

Contextualized word representations are able to give different representations for the same word in different contexts, and they have been shown to be effective in downstream natural language processing tasks, such as question answering, named entity recognition, and sentiment analysis. However, evaluation on word sense disambiguation (WSD) in prior work shows that using contextualized word representations does not outperform the state-of-the-art approach that makes use of non-contextualized word embeddings. In this paper, we explore different strategies of integrating pre-trained contextualized word representations and our best strategy achieves accuracies exceeding the best prior published accuracies by significant margins on multiple benchmark WSD datasets.
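One common way to apply contextualized representations to WSD (the paper explores several integration strategies; the specific method below is not taken from the paper) is nearest-neighbor classification: average the contextual embeddings of sense-annotated training occurrences into per-sense centroids, then assign a test occurrence the sense whose centroid is most cosine-similar. A minimal sketch, using small hand-made vectors in place of real BERT/ELMo embeddings:

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def sense_centroids(labeled):
    """Average the contextual embeddings of each sense's training examples."""
    sums, counts = {}, {}
    for sense, vec in labeled:
        acc = sums.setdefault(sense, [0.0] * len(vec))
        for i, x in enumerate(vec):
            acc[i] += x
        counts[sense] = counts.get(sense, 0) + 1
    return {s: [x / counts[s] for x in v] for s, v in sums.items()}

def disambiguate(target_vec, centroids):
    """Pick the sense whose centroid is closest to the target occurrence."""
    return max(centroids, key=lambda s: cosine(target_vec, centroids[s]))

# Toy stand-in embeddings; in practice these would come from a
# pre-trained contextualized encoder run over each sentence.
labeled = [
    ("bank%finance", [0.9, 0.1, 0.0]),
    ("bank%finance", [0.8, 0.2, 0.1]),
    ("bank%river",   [0.1, 0.9, 0.3]),
]
centroids = sense_centroids(labeled)
print(disambiguate([0.85, 0.15, 0.05], centroids))  # → bank%finance
```

Because the same surface word gets different vectors in different sentences, the test-time embedding already encodes the disambiguating context; the classifier itself can stay this simple.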


