JMLR: Workshop and Conference Proceedings

Training Variational Autoencoders with Buffered Stochastic Variational Inference


Abstract

The recognition network in deep latent variable models such as variational autoencoders (VAEs) relies on amortized inference for efficient posterior approximation that can scale up to large datasets. However, this technique has also been demonstrated to select suboptimal variational parameters, often resulting in considerable additional error called the amortization gap. To close the amortization gap and improve the training of the generative model, recent works have introduced an additional refinement step that applies stochastic variational inference (SVI) to improve upon the variational parameters returned by the amortized inference model. In this paper, we propose Buffered Stochastic Variational Inference (BSVI), a new refinement procedure that makes use of SVI's sequence of intermediate variational proposal distributions and their corresponding importance weights to construct a new generalized importance-weighted lower bound. We demonstrate empirically that training variational autoencoders with BSVI consistently outperforms SVI, yielding an improved training procedure for VAEs.
