Pattern Recognition Letters

A new look at discriminative training for hidden Markov models


Abstract

Discriminative training for hidden Markov models (HMMs) has been a central theme in speech recognition research for many years. One of the most popular techniques is minimum classification error (MCE) training, whose objective function is closely related to the empirical error rate and which has traditionally been optimized by gradient descent. In this paper, we provide a new look at the MCE technique in two ways. First, we develop a non-trivial framework in which the MCE objective function is re-formulated as a rational function over multiple sentence-level training tokens. Second, using this novel re-formulation, we develop a new optimization method for discriminatively estimating HMM parameters based on growth transformation, i.e., the extended Baum-Welch algorithm. Technical details are given for the use of lattices as a rich representation of competing candidates in MCE training.
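The sentence-level MCE objective the abstract refers to is, in its classic form, a sigmoid-smoothed count of recognition errors: a misclassification measure contrasts the score of the correct transcription with a soft-max over competitor scores, and a sigmoid turns that margin into a differentiable 0/1 error surrogate. The following is a minimal sketch, not from the paper itself; the function name, the smoothing constants `eta` and `gamma`, and the plain-list inputs are illustrative assumptions.

```python
import math

def mce_loss(score_correct, competitor_scores, eta=1.0, gamma=1.0):
    """Smoothed sentence-level MCE loss for one training token.

    score_correct: log-domain recognizer score of the reference transcription.
    competitor_scores: log-domain scores of competing hypotheses
        (in lattice-based training these would come from the lattice).
    eta: exponent of the soft-max over competitors (illustrative constant).
    gamma: slope of the sigmoid smoothing the 0/1 error count.
    """
    # Anti-discriminant: soft-max (log-sum-exp) over competitor scores,
    # computed in a numerically stable way.
    m = max(competitor_scores)
    anti = m + math.log(sum(math.exp(eta * (s - m))
                            for s in competitor_scores)) / eta
    # Misclassification measure: positive when a competitor outscores
    # the correct transcription.
    d = anti - score_correct
    # Sigmoid smoothing: close to 1 for a misrecognized token,
    # close to 0 for a correctly recognized one.
    return 1.0 / (1.0 + math.exp(-gamma * d))
```

Summed over all training tokens, this loss approximates the empirical sentence error rate; the paper's contribution is to re-express that sum as a single rational function so that growth-transformation (extended Baum-Welch) updates, rather than gradient descent, can be applied.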
