
Neural Autoregressive Flows


Abstract

Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF) (Papamakarios et al., 2017), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time (Oord et al., 2017), via Inverse Autoregressive Flows (IAF) (Kingma et al., 2016). We unify and generalize these approaches, replacing the (conditionally) affine univariate transformations of MAF/IAF with a more general class of invertible univariate transformations expressed as monotonic neural networks. We demonstrate that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions. Experimentally, NAF yields state-of-the-art performance on a suite of density estimation tasks and outperforms IAF in variational autoencoders trained on binarized MNIST.
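The core idea stated in the abstract is to replace the affine step y_t = a_t * x_t + b_t used by MAF/IAF with a small monotonic neural network whose (pseudo-)parameters are produced by the autoregressive conditioner. A minimal sketch of such a transformer, in the style of the paper's deep sigmoidal flow, is given below; the function name `dsf_transform`, the NumPy setting, and the toy parameter shapes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a monotonic univariate transformer (deep-sigmoidal-flow style).
# The unconstrained pseudo-parameters pre_a, pre_b, pre_w stand in for the outputs
# of an autoregressive conditioner (e.g. a MADE-style network) for one dimension.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(p):
    return np.log(p) - np.log1p(-p)

def dsf_transform(x, pre_a, pre_b, pre_w):
    """y = logit( sum_k w_k * sigmoid(a_k * x + b_k) ).

    Constraining a_k > 0 (via exp) and w to the simplex (via softmax) makes the
    map strictly increasing in x, and therefore invertible.
    """
    a = np.exp(pre_a)                                # positive slopes
    w = np.exp(pre_w - np.logaddexp.reduce(pre_w))   # softmax -> convex weights
    s = np.sum(w * sigmoid(a * x + pre_b))           # mixture of sigmoids in (0, 1)
    return logit(s)

# Toy check with K = 3 sigmoid units: the transform is strictly increasing.
rng = np.random.default_rng(0)
pre_a, pre_b, pre_w = rng.normal(size=(3, 3))
xs = np.linspace(-4.0, 4.0, 200)
ys = np.array([dsf_transform(x, pre_a, pre_b, pre_w) for x in xs])
assert np.all(np.diff(ys) > 0)
```

In a full flow, one such transformer is applied per dimension, with its pseudo-parameters conditioned on the preceding dimensions; the Jacobian of the overall transform stays triangular, so its log-determinant is the sum of the univariate derivatives, which are available in closed form.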

