International Conference on Computational Linguistics

Random Restarts in Minimum Error Rate Training for Statistical Machine Translation



Abstract

Och's (2003) minimum error rate training (MERT) procedure is the most commonly used method for training feature weights in statistical machine translation (SMT) models. The use of multiple randomized starting points in MERT is a well-established practice, although there seems to be no published systematic study of its benefits. We compare several ways of performing random restarts with MERT. We find that all of our random restart methods outperform MERT without random restarts, and we develop some refinements of random restarts that are superior to the most common approach with regard to resulting model quality and training time.
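The random-restart practice the abstract investigates can be sketched generically. The toy below is an illustration only, not the authors' MERT implementation: the greedy coordinate-wise search stands in for Och's exact line search along each feature-weight axis, and the Rastrigin function stands in for the dev-set error (e.g. 1 − BLEU), which is likewise non-convex with many local optima. All names and parameters here are hypothetical.

```python
import math
import random

def coordinate_descent(f, start, step=0.1, iters=200):
    """Greedy coordinate-wise local search (a stand-in for MERT's
    per-dimension line search). Minimizes f from a given start."""
    x = list(start)
    best = f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                cand = list(x)
                cand[i] += delta
                val = f(cand)
                if val < best:
                    x, best = cand, val
                    improved = True
        if not improved:  # local optimum reached
            break
    return x, best

def random_restarts(f, dim, n_restarts=20, seed=0):
    """Run the local optimizer from several random starting points
    and keep the best result -- the practice the paper studies."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(n_restarts):
        start = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        x, val = coordinate_descent(f, start)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

def rastrigin(x):
    """Toy multimodal objective standing in for dev-set error."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

weights, error = random_restarts(rastrigin, dim=2)
```

Because each restart can only help (the best result over restarts is no worse than any single run), the open questions the paper addresses are how many restarts to use, how to draw the starting points, and how to trade the extra training time against model quality.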


