Journal of Computer Science and Technology > Optimizing Non-Decomposable Evaluation Metrics for Neural Machine Translation

Optimizing Non-Decomposable Evaluation Metrics for Neural Machine Translation


Abstract

While optimizing model parameters with respect to evaluation metrics has recently proven to benefit end-to-end neural machine translation (NMT), the evaluation metrics used in training are restricted to be defined at the sentence level to facilitate online learning algorithms. This is undesirable because the final evaluation metrics used in the testing phase are usually non-decomposable (i.e., they are defined at the corpus level and cannot be expressed as the sum of sentence-level metrics). To minimize the discrepancy between training and testing, we propose to extend the minimum risk training (MRT) algorithm to take non-decomposable corpus-level evaluation metrics into consideration while still keeping the advantages of online training. This can be done by calculating corpus-level evaluation metrics on a subset of training data at each step in online training. Experiments on Chinese-English and English-French translation show that our approach improves the correlation between training and testing and significantly outperforms the MRT algorithm using decomposable evaluation metrics.
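The non-decomposability mentioned above can be illustrated with a minimal sketch. The snippet below, simplified to unigram precision with a brevity penalty (the count-pooling scheme is standard corpus-level BLEU, but the toy sentence pairs are hypothetical), shows that the corpus-level score pools n-gram counts across all sentences before dividing, so it is generally not equal to the average of per-sentence scores:

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Count the n-grams occurring in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def clipped_matches(hyp, ref, n):
    """Hypothesis n-grams matched in the reference, clipped by reference counts."""
    h, r = ngram_counts(hyp, n), ngram_counts(ref, n)
    return sum(min(c, r[g]) for g, c in h.items())

def bleu1(pairs):
    """Corpus-level BLEU restricted to unigrams: counts are pooled over
    ALL (hypothesis, reference) pairs before the precision is computed,
    and a single brevity penalty is applied to the pooled lengths."""
    match = sum(clipped_matches(h, r, 1) for h, r in pairs)
    hyp_len = sum(len(h) for h, _ in pairs)
    ref_len = sum(len(r) for _, r in pairs)
    bp = 1.0 if hyp_len >= ref_len else math.exp(1.0 - ref_len / hyp_len)
    return bp * match / hyp_len

# Toy corpus of two (hypothesis, reference) pairs.
pairs = [
    ("the cat sat".split(), "the cat sat down".split()),
    ("a dog".split(), "a big dog ran away".split()),
]

corpus_score = bleu1(pairs)                                   # pooled counts
mean_of_sentence_scores = sum(bleu1([p]) for p in pairs) / len(pairs)

# The two values differ: the corpus metric cannot be written as a
# sum (or mean) of sentence-level metrics, i.e., it is non-decomposable.
print(corpus_score, mean_of_sentence_scores)
```

In the approach described in the abstract, a metric of this corpus-level form would be evaluated on a sampled subset of the training data at each online step, rather than on one sentence at a time.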
