Hierarchical Joint Learning: Improving Joint Parsing and Named Entity Recognition with Non-Jointly Labeled Data

Annual Meeting of the Association for Computational Linguistics

Abstract

One of the main obstacles to producing high quality joint models is the lack of jointly annotated data. Joint modeling of multiple natural language processing tasks outperforms single-task models learned from the same data, but still under-performs compared to single-task models learned on the more abundant quantities of available single-task annotated data. In this paper we present a novel model which makes use of additional single-task annotated data to improve the performance of a joint model. Our model utilizes a hierarchical prior to link the feature weights for shared features in several single-task models and the joint model. Experiments on joint parsing and named entity recognition, using the OntoNotes corpus, show that our hierarchical joint model can produce substantial gains over a joint model trained on only the jointly annotated data.
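For concreteness, the following is a minimal sketch (not taken from the paper) of how a hierarchical prior can tie the weight vectors of several single-task models and a joint model together: each model keeps its own weights over a shared feature space, each of those weight vectors gets a Gaussian prior centered on a top-level vector, and the top-level vector itself gets a zero-mean Gaussian prior. The function name, the variance parameters, and the model keys below are illustrative assumptions, not the paper's notation.

# A NumPy sketch of a hierarchical Gaussian prior tying per-model weights
# to a shared top-level vector.  All names here (hierarchical_prior,
# sigma_m, sigma_star, the model keys) are illustrative assumptions.
import numpy as np

def hierarchical_prior(theta, theta_star, sigma_m=1.0, sigma_star=1.0):
    """Return the log-prior and its gradients.

    theta      : dict mapping model name -> weight vector (shared feature space)
    theta_star : top-level weight vector shared across models
    sigma_m    : std. dev. of each model's prior around theta_star
    sigma_star : std. dev. of the zero-mean prior on theta_star
    """
    log_prior = -np.sum(theta_star ** 2) / (2 * sigma_star ** 2)
    grads = {}                                   # gradient w.r.t. each theta[m]
    grad_star = -theta_star / sigma_star ** 2    # gradient w.r.t. theta_star
    for m, theta_m in theta.items():
        diff = theta_m - theta_star
        log_prior += -np.sum(diff ** 2) / (2 * sigma_m ** 2)
        grads[m] = -diff / sigma_m ** 2          # pulls theta[m] toward theta_star
        grad_star += diff / sigma_m ** 2         # pulls theta_star toward each theta[m]
    return log_prior, grads, grad_star

# Usage sketch: add grads[m] to the data log-likelihood gradient of model m,
# add grad_star to the (otherwise zero) gradient for theta_star, then take a
# gradient step on all vectors jointly.
d = 5
theta = {m: np.zeros(d) for m in ("joint", "parse_only", "ner_only")}
log_p, grads, grad_star = hierarchical_prior(theta, np.zeros(d))

Under this kind of prior, features observed only in the single-task data still pull the shared top-level weights, which in turn inform the joint model, which is the mechanism the abstract credits for the gains over training on jointly annotated data alone.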