Venue: Conference on Machine Translation (WMT); Annual Meeting of the Association for Computational Linguistics

QE BERT: Bilingual BERT using Multi-task Learning for Neural Quality Estimation



Abstract

For translation quality estimation at the word and sentence levels, this paper presents a novel approach based on BERT, which has recently achieved impressive results on various natural language processing tasks. Our proposed model re-purposes BERT for translation quality estimation and uses multi-task learning across the sentence-level task and the word-level sub-tasks (i.e., source word, target word, and target gap). Experimental results on the WMT 19 Quality Estimation shared task show that our systems are competitive and provide significant improvements over the baseline.
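The multi-task setup described in the abstract, a shared bilingual encoder feeding one sentence-level head and several word-level heads whose losses are summed, can be sketched as follows. This is an illustrative sketch only: the random "encoder" output, the dimensions, and the head weights are placeholders standing in for the paper's bilingual BERT, not its actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper uses a bilingual BERT encoder over the
# (source, target) pair, replaced here by random token states for illustration.
hidden, seq_len = 8, 5
token_states = rng.normal(size=(seq_len, hidden))  # shared encoder output
cls_state = token_states[0]                        # [CLS]-style summary vector

# Task-specific heads on top of the shared representation.
w_sent = rng.normal(size=hidden)        # sentence-level score regression
w_word = rng.normal(size=(hidden, 2))   # word-level OK/BAD classification

# Sentence-level prediction squashed into (0, 1) with a sigmoid.
sent_score = float(1 / (1 + np.exp(-cls_state @ w_sent)))

# Word-level predictions: softmax over OK/BAD per token.
word_logits = token_states @ w_word
word_probs = np.exp(word_logits) / np.exp(word_logits).sum(axis=1, keepdims=True)

# Multi-task learning sums the per-task losses; dummy gold labels for the sketch.
gold_sent = 0.3
gold_words = rng.integers(0, 2, size=seq_len)
sent_loss = (sent_score - gold_sent) ** 2                               # MSE
word_loss = -np.log(word_probs[np.arange(seq_len), gold_words]).mean()  # CE
loss = sent_loss + word_loss
```

In the paper's full setting there are separate word-level heads for source words, target words, and target gaps; the joint loss above would simply accumulate one cross-entropy term per sub-task alongside the sentence-level term.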
