Statistics and Computing

Properties of the stochastic approximation EM algorithm with mini-batch sampling

Abstract

To deal with very large datasets, a mini-batch version of the Monte Carlo Markov Chain Stochastic Approximation Expectation-Maximization (MCMC-SAEM) algorithm for general latent variable models is proposed. For exponential models the algorithm is shown to be convergent under classical conditions as the number of iterations increases. Numerical experiments illustrate the performance of the mini-batch algorithm in various models. In particular, we highlight that mini-batch sampling results in a substantial speed-up of the convergence of the sequence of estimators generated by the algorithm. Moreover, insights on the effect of the mini-batch size on the limit distribution are presented. Finally, we illustrate how to use mini-batch sampling in practice to improve results when a constraint on the computing time is given.
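The abstract describes the algorithm only at a high level. For intuition, the following is a minimal, self-contained sketch (not the authors' implementation) of one possible mini-batch MCMC-SAEM iteration on a toy exponential-family latent variable model: at each iteration the latent variables of a randomly drawn mini-batch are refreshed by a single Metropolis-Hastings move, the corresponding sufficient statistics are updated by stochastic approximation with a decreasing step size, and the maximization step is carried out in closed form. All names and numerical choices (`minibatch_saem`, `batch_frac`, the step-size exponent 0.6, the Gaussian random-effects model) are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def minibatch_saem(y, sigma_y=1.0, sigma_z=1.0, mu0=0.0,
                   batch_frac=0.1, n_iter=2000, seed=0):
    """Mini-batch MCMC-SAEM sketch for estimating mu in the toy model
    y_i | z_i ~ N(z_i, sigma_y^2),  z_i ~ N(mu, sigma_z^2)  (variances known)."""
    rng = np.random.default_rng(seed)
    n = y.size
    batch_size = max(1, int(batch_frac * n))
    z = np.full(n, mu0, dtype=float)   # current latent variables, one per individual
    s = np.zeros(n)                    # stochastic approximation of E[z_i | y_i; mu]
    mu = mu0

    def log_post(zz, yy, mu):
        # Log of p(z_i | y_i; mu) up to an additive constant.
        return (-0.5 * ((yy - zz) / sigma_y) ** 2
                - 0.5 * ((zz - mu) / sigma_z) ** 2)

    for k in range(1, n_iter + 1):
        # Simulation step, restricted to a mini-batch: one Metropolis-Hastings
        # move per selected individual, targeting p(z_i | y_i; mu).
        idx = rng.choice(n, size=batch_size, replace=False)
        prop = z[idx] + 0.5 * rng.standard_normal(batch_size)
        log_u = np.log(rng.uniform(size=batch_size))
        accept = log_u < log_post(prop, y[idx], mu) - log_post(z[idx], y[idx], mu)
        z[idx] = np.where(accept, prop, z[idx])

        # Stochastic approximation step with a decreasing step size gamma_k,
        # applied only to the individuals of the current mini-batch.
        gamma = 1.0 / k ** 0.6
        s[idx] += gamma * (z[idx] - s[idx])

        # M-step: closed-form update of mu for this exponential-family model.
        mu = s.mean()
    return mu


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    z_true = 2.0 + rng.standard_normal(10_000)   # latent effects around mu = 2
    y = z_true + rng.standard_normal(10_000)     # observations
    print(minibatch_saem(y))                     # estimate should be close to 2.0
```

In this sketch, increasing `batch_frac` makes each iteration more expensive but refreshes more latent variables per pass, which is one way to think about the computing-time trade-off mentioned in the abstract.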
