International Conference on Machine Learning

Stochastic Gradient MCMC Methods for Hidden Markov Models

Abstract

Stochastic gradient MCMC (SG-MCMC) algorithms have proven useful in scaling Bayesian inference to large datasets under an assumption of i.i.d. data. We instead develop an SG-MCMC algorithm to learn the parameters of hidden Markov models (HMMs) for time-dependent data. There are two challenges to applying SG-MCMC in this setting: the latent discrete states, and the need to break dependencies when considering minibatches. We consider a marginal likelihood representation of the HMM and propose an algorithm that harnesses the inherent memory decay of the process. We demonstrate the effectiveness of our algorithm on synthetic experiments and an ion channel recording dataset, with runtimes significantly outperforming batch MCMC.
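As a rough illustration of the ideas in the abstract (not the paper's actual algorithm), the sketch below runs stochastic gradient Langevin dynamics on the parameters of a Gaussian-emission HMM. Each update draws a random subchain, pads it with buffer observations on each side so that starting the forward recursion from a uniform state distribution is a reasonable approximation (the memory-decay idea), and computes the padded subchain's marginal likelihood with the forward algorithm. The SGLD update, the finite-difference gradients, and all names (log_marginal, sgld_step, buffer length B, subchain length L) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def log_marginal(y, logits, means, sigma=1.0):
    """Forward algorithm: log p(y) under a Gaussian-emission HMM.
    logits: (K, K) unconstrained transition parameters (softmax per row).
    means:  (K,) emission means with shared std `sigma`."""
    K = means.shape[0]
    A = softmax(logits, axis=1)
    alpha = np.full(K, 1.0 / K)  # uniform start; the buffer justifies this
    ll = 0.0
    for t, yt in enumerate(y):
        like = np.exp(-0.5 * ((yt - means) / sigma) ** 2)
        alpha = (alpha @ A) * like if t > 0 else alpha * like
        c = alpha.sum()
        ll += np.log(c)
        alpha /= c
    return ll

def unpack(theta, K):
    return theta[:K * K].reshape(K, K), theta[K * K:]

def grad_fd(f, theta, eps=1e-5):
    """Central finite-difference gradient (keeps the sketch dependency-free)."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        g[i] = (f(theta + d) - f(theta - d)) / (2 * eps)
    return g

def sgld_step(theta, y, K, L=50, B=10, step=1e-4):
    """One SGLD update from a random buffered subchain. The B buffer
    observations on each side let the forward pass 'forget' the unknown
    filtered distribution at the subchain boundary (memory decay)."""
    T = len(y)
    t0 = rng.integers(B, T - L - B)
    window = y[t0 - B : t0 + L + B]
    f = lambda th: log_marginal(window, *unpack(th, K))
    # Scale the minibatch gradient up to the full sequence (rough: the
    # buffers are included in the window); N(0, I) prior on theta.
    g = (T / L) * grad_fd(f, theta) - theta
    return theta + 0.5 * step * g + np.sqrt(step) * rng.normal(size=theta.shape)

# Demo on a synthetic two-state chain with well-separated emissions.
K, T = 2, 2000
A_true = np.array([[0.95, 0.05], [0.10, 0.90]])
mu_true = np.array([-2.0, 2.0])
z = np.zeros(T, dtype=int)
for t in range(1, T):
    z[t] = rng.choice(K, p=A_true[z[t - 1]])
y = mu_true[z] + rng.normal(size=T)

theta = 0.1 * rng.normal(size=K * K + K)
for _ in range(200):
    theta = sgld_step(theta, y, K)
print("estimated emission means:", np.sort(unpack(theta, K)[1]))
```

In this sketch the buffer length trades bias against cost: a longer buffer makes the uniform initialization closer to the true filtered distribution at the subchain boundary, at the price of a longer forward pass per gradient evaluation.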
