
Optimizing the number of states, training iterations and Gaussians in an HMM-based handwritten word recognizer



Abstract

In off-line handwriting recognition, classifiers based on hidden Markov models (HMMs) have become very popular. However, while there exist well-established training algorithms, such as the Baum-Welch procedure, which optimize the transition and output probabilities of a given HMM architecture, the architecture itself, and in particular the number of states, must be chosen "by hand". The number of training iterations and the output distributions likewise need to be defined by the system designer. In this paper we examine some optimization strategies for an HMM classifier that works with continuous feature values and uses the Baum-Welch training algorithm. The free parameters of the optimization procedure introduced in this paper are the number of states of a model, the number of training iterations, and the number of Gaussian mixtures for each state. The proposed optimization strategies are evaluated in the context of a handwritten word recognition task.
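As a rough illustration of the idea sketched in the abstract (not the authors' actual procedure), the snippet below grid-searches the three free parameters named there, the number of states, the number of Baum-Welch iterations, and the number of Gaussian mixtures per state, using hmmlearn's GMMHMM on toy continuous feature sequences and scoring each configuration by held-out log-likelihood. The data generator and the parameter grids are invented for the example; a real word recognizer would score recognition accuracy on a validation set instead.

    # Minimal sketch: tune states, Baum-Welch iterations, and Gaussians per state.
    # Assumes hmmlearn's GMMHMM; the feature sequences here are synthetic stand-ins
    # for the sliding-window features of a handwritten word.
    import itertools

    import numpy as np
    from hmmlearn.hmm import GMMHMM

    rng = np.random.default_rng(0)

    def make_sequences(n_seq, dim=9):
        # Toy continuous feature sequences of varying length.
        return [rng.normal(size=(rng.integers(20, 40), dim)) for _ in range(n_seq)]

    def stack(seqs):
        # hmmlearn expects concatenated frames plus per-sequence lengths.
        return np.vstack(seqs), [len(s) for s in seqs]

    train_X, train_len = stack(make_sequences(30))
    valid_X, valid_len = stack(make_sequences(10))

    best = None
    # The three free parameters from the abstract.
    for n_states, n_iter, n_mix in itertools.product([4, 6, 8], [5, 10, 20], [1, 2, 4]):
        model = GMMHMM(n_components=n_states, n_mix=n_mix, n_iter=n_iter,
                       covariance_type="diag", random_state=0)
        model.fit(train_X, lengths=train_len)            # Baum-Welch (EM) training
        score = model.score(valid_X, lengths=valid_len)  # held-out log-likelihood
        if best is None or score > best[0]:
            best = (score, n_states, n_iter, n_mix)

    print("best validation log-likelihood %.1f with %d states, %d iterations, "
          "%d Gaussians per state" % best)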
