
Unbabel's Participation in the WMT17 Translation Quality Estimation Shared Task



Abstract

This paper presents the contribution of the Unbabel team to the WMT 2017 Shared Task on Translation Quality Estimation. We participated in the word-level and sentence-level tracks. We describe our two submitted systems: (i) StackedQE, a "pure" QE system, trained only on the provided training sets, which is a stacked combination of a feature-rich sequential linear model with a neural network, and (ii) FullStackedQE, which also stacks the predictions of an automatic post-editing system trained on additional data. When evaluated on the English-German and German-English datasets, FullStackedQE achieved word-level F1-MULT scores of 56.6% and 52.9%, and sentence-level Pearson correlation scores of 64.1% and 62.6%, respectively. Our system ranked second in both tracks, being statistically indistinguishable from the best system in the word-level track.
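
The stacking described in the abstract can be illustrated with a minimal per-token sketch: predictions from a first-stage, feature-based model are appended as extra features for a second-stage neural model. This is only an assumed illustration; the variable names, synthetic data, and simple scikit-learn classifiers below are placeholders, not the paper's actual models (which use a feature-rich sequential linear model and a neural sequence model for word-level OK/BAD tagging).

```python
# Hypothetical sketch of stacked QE: first-stage probabilities become
# additional features for a second-stage model. Data and models are
# stand-ins, not the systems described in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X_feats = rng.normal(size=(1000, 20))   # hand-crafted per-token features (synthetic)
y = rng.integers(0, 2, size=1000)       # word-level OK / BAD labels (synthetic)

# Stage 1: feature-rich linear model (stand-in for the sequential linear model).
linear = LogisticRegression(max_iter=1000).fit(X_feats, y)
linear_probs = linear.predict_proba(X_feats)[:, 1:]  # first-stage predictions

# Stage 2: neural model trained on the original features plus the
# stacked first-stage predictions (the "stacked combination").
X_stacked = np.hstack([X_feats, linear_probs])
neural = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X_stacked, y)
```

In the same spirit, FullStackedQE would additionally stack features derived from an automatic post-editing system's output alongside the first-stage predictions.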


