22nd International Conference on Computational Linguistics

Random Restarts in Minimum Error Rate Training for Statistical Machine Translation



Abstract

Och's (2003) minimum error rate training (MERT) procedure is the most commonly used method for training feature weights in statistical machine translation (SMT) models. The use of multiple randomized starting points in MERT is a well-established practice, although there seems to be no published systematic study of its benefits. We compare several ways of performing random restarts with MERT. We find that all of our random restart methods outperform MERT without random restarts, and we develop some refinements of random restarts that are superior to the most common approach with regard to resulting model quality and training time.
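The abstract's core idea, restarting MERT's weight optimization from several random starting points and keeping the best result, can be sketched as follows. This is a hedged illustration, not Och's actual MERT procedure: the coordinate-wise greedy search and the toy error surface below are stand-ins for MERT's exact line search and the corpus-level translation error, and all function names are hypothetical.

```python
import random

def optimize(start, objective, iters=100, step=0.1):
    """Greedy coordinate-wise search (a toy stand-in for MERT's
    exact line search over feature weights). Minimizes `objective`."""
    weights = list(start)
    best = objective(weights)
    for _ in range(iters):
        improved = False
        for i in range(len(weights)):
            for delta in (-step, step):
                cand = list(weights)
                cand[i] += delta
                score = objective(cand)
                if score < best:
                    weights, best, improved = cand, score, True
        if not improved:  # local optimum under this step size
            break
    return weights, best

def mert_with_random_restarts(objective, dim, restarts=20, seed=0):
    """Run the optimizer from `restarts` random starting points and
    keep the weight vector with the lowest error, as in the common
    random-restart practice the paper studies."""
    rng = random.Random(seed)
    best_weights, best_err = None, float("inf")
    for _ in range(restarts):
        start = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
        weights, err = optimize(start, objective)
        if err < best_err:
            best_weights, best_err = weights, err
    return best_weights, best_err

# Toy non-convex "error surface" standing in for the piecewise-constant
# corpus-level error MERT optimizes; the flat penalty band near zero
# creates local optima that single-start search can get stuck in.
def toy_error(w):
    return sum((x - 0.5) ** 2 for x in w) + 0.3 * sum(abs(x) < 0.1 for x in w)

weights, err = mert_with_random_restarts(toy_error, dim=3)
```

Because the surface has spurious local optima, the multi-start version typically finds a lower error than a single run from one fixed starting point, which is the effect the paper quantifies.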
