International conference on computational linguistics

Word-Level Loss Extensions for Neural Temporal Relation Classification


Abstract

Unsupervised pre-trained word embeddings are used effectively for many tasks in natural language processing to leverage unlabeled textual data. Often these embeddings are either used as initializations or as fixed word representations for task-specific classification models. In this work, we extend our classification model's task loss with an unsupervised auxiliary loss on the word-embedding level of the model. This is to ensure that the learned word representations contain both task-specific features learned from the supervised loss component and more general features learned from the unsupervised loss component. We evaluate our approach on the task of temporal relation extraction, in particular, narrative containment relation extraction from clinical records, and show that continued training of the embeddings on the unsupervised objective together with the task objective gives better task-specific embeddings, and results in an improvement over the state of the art on the THYME dataset, using only a general-domain part-of-speech tagger as a linguistic resource.
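
The core idea of the abstract, a supervised task loss and an unsupervised word-level loss trained jointly over a shared embedding matrix, can be pictured with a minimal PyTorch-style sketch. This is not the authors' implementation: the skip-gram-with-negative-sampling objective stands in for the unsupervised component, and the class names, the LSTM encoder, and the weighting factor lam are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class JointLossModel(nn.Module):
        # Hypothetical model: one embedding table feeds both the supervised
        # classifier and the unsupervised skip-gram-style objective.
        def __init__(self, vocab_size, emb_dim, hidden_dim, num_classes):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, emb_dim)   # shared by both losses
            self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
            self.classifier = nn.Linear(hidden_dim, num_classes)
            self.context = nn.Embedding(vocab_size, emb_dim)     # skip-gram output vectors

        def task_loss(self, token_ids, labels):
            # Supervised relation-classification loss over encoded token sequences.
            emb = self.embedding(token_ids)                      # (B, T, D)
            _, (h, _) = self.encoder(emb)
            logits = self.classifier(h[-1])                      # last layer's hidden state
            return F.cross_entropy(logits, labels)

        def unsup_loss(self, centers, contexts, negatives):
            # Skip-gram negative-sampling loss that keeps the shared
            # embeddings close to a general distributional objective.
            c = self.embedding(centers)                          # (B, D)
            pos = self.context(contexts)                         # (B, D)
            neg = self.context(negatives)                        # (B, K, D)
            pos_score = F.logsigmoid((c * pos).sum(-1))          # (B,)
            neg_score = F.logsigmoid(-(neg @ c.unsqueeze(-1)).squeeze(-1)).sum(-1)
            return -(pos_score + neg_score).mean()

    def training_step(model, sup_batch, unsup_batch, lam=0.1):
        # Total objective: task loss plus a weighted word-level auxiliary loss;
        # lam is an assumed hyperparameter, not a value from the paper.
        token_ids, labels = sup_batch
        centers, contexts, negatives = unsup_batch
        return model.task_loss(token_ids, labels) + lam * model.unsup_loss(centers, contexts, negatives)

Because both losses backpropagate into the same embedding table, each update pulls the word representations toward task-specific features while the auxiliary term preserves more general distributional features, matching the motivation stated in the abstract.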
