On the Posterior Distribution of HMMs for a Long Sequence

Abstract

Hidden Markov models (HMMs) are widely applied to the analysis of time-dependent data sequences, with applications in nonlinear signal processing, natural language processing, and bioinformatics. Training data for HMMs come in two possible formats: a set of many time-dependent sequences, or a single infinitely long sequence. The learning process is one of the main concerns in machine learning. For the former case, there is a method to reveal the generalization ability based on algebraic geometry; for the latter, however, no theoretical analysis exists. To lay a foundation for such an analysis, this paper reports some unique properties of the likelihood function observed experimentally and explains them theoretically. The results show that the likelihood function implicitly includes a local-maximum factor, which can make the learning process slow, and that this slow learning achieves high performance under a stationary-state evaluation.
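The long-sequence setting the abstract discusses hinges on evaluating the HMM likelihood over many time steps. As a minimal sketch (not the paper's method), the following assumes a discrete-emission HMM with hypothetical parameters `pi`, `A`, and `B`, and computes the log-likelihood of a long sequence with the scaled forward algorithm, which stays numerically stable where a naive product of probabilities would underflow:

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete-emission HMM,
    computed with the scaled forward algorithm.

    pi : (K,)   initial state distribution
    A  : (K, K) transition matrix, A[i, j] = P(next state j | state i)
    B  : (K, M) emission matrix,   B[i, m] = P(symbol m | state i)
    """
    alpha = pi * B[:, obs[0]]              # forward message at t = 0
    ll = 0.0
    for t, o in enumerate(obs):
        if t > 0:
            alpha = (alpha @ A) * B[:, o]  # propagate one step, then emit
        c = alpha.sum()                    # scaling constant = P(o_t | o_<t)
        ll += np.log(c)                    # accumulate log-likelihood
        alpha /= c                         # rescale to prevent underflow
    return ll

# Toy 2-state, 2-symbol HMM evaluated on a long (10,000-step) sequence
rng = np.random.default_rng(0)
pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.7, 0.3], [0.1, 0.9]])
obs = rng.integers(0, 2, size=10_000)
print(log_likelihood(obs, pi, A, B))
```

The per-step rescaling is what makes a single very long sequence tractable: each scaling constant is the one-step predictive probability, so their logs sum to the total log-likelihood without ever forming the vanishing joint probability directly.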
