International Journal of Artificial Intelligence Tools: Architectures, Languages, Algorithms

A Nonstationary Hidden Markov Model with Approximately Infinitely-Long Time-Dependencies


Abstract

Hidden Markov models (HMMs) are a popular approach for modeling sequential data, typically based on the assumption of a first-order Markov chain. In other words, only one-step-back dependencies are modeled, which is a rather unrealistic assumption in most applications. In this paper, we propose a method for postulating HMMs with approximately infinitely-long time-dependencies. Our approach considers the whole history of model states in the postulated dependencies by making use of a recently proposed nonparametric Bayesian method for modeling label sequences with infinitely-long time dependencies, namely the sequence memoizer. Despite the entailed infinitely-long time-dependencies, we derive training and inference algorithms for our model with computational costs identical to those of simple first-order HMMs, by employing a mean-field-like approximation. The efficacy of our proposed model is demonstrated experimentally.
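For context on the cost claim, the following is a minimal sketch of the standard first-order HMM forward recursion, which runs in O(T·K²) for T observations and K states; the abstract's point is that the proposed model matches this cost despite modeling (approximately) infinitely-long state dependencies. All names and parameter values below are illustrative, not taken from the paper.

```python
import numpy as np
from itertools import product

def forward_loglik(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under a first-order HMM.

    pi  : (K,)   initial state distribution
    A   : (K, K) transitions, A[i, j] = p(z_t = j | z_{t-1} = i)
    B   : (K, M) emissions,   B[j, x] = p(x_t = x | z_t = j)
    obs : sequence of observation indices
    """
    alpha = pi * B[:, obs[0]]              # alpha_1(j) = pi_j * b_j(x_1)
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()            # rescale to avoid underflow
    for x in obs[1:]:
        alpha = (alpha @ A) * B[:, x]      # one-step-back dependency only
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

# Illustrative 2-state example (values are arbitrary).
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B = np.array([[0.8, 0.2], [0.2, 0.8]])
obs = [0, 0, 1]
loglik = forward_loglik(pi, A, B, obs)
```

The recursion's single matrix-vector product per time step is exactly where the first-order assumption enters: the filtered state distribution `alpha` summarizes the entire past.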
