JMLR: Workshop and Conference Proceedings

Non-parametric estimation of Jensen-Shannon Divergence in Generative Adversarial Network training

Abstract

Generative Adversarial Networks (GANs) have become a widely popular framework for generative modelling of high-dimensional datasets. However, their training is notoriously difficult. This work presents a rigorous statistical analysis of GANs, providing straightforward explanations for common training pathologies such as vanishing gradients. Furthermore, it proposes a new training objective, Kernel GANs, and demonstrates its practical effectiveness on large-scale real-world datasets. A key element in the analysis is the distinction between training with respect to the (unknown) data distribution and training with respect to its empirical counterpart. To overcome issues in GAN training, we pursue the idea of smoothing the Jensen-Shannon Divergence (JSD) by incorporating noise into the input distributions of the discriminator. As we show, this effectively leads to an empirical version of the JSD in which the true and generator densities are replaced by kernel density estimates, which yields Kernel GANs.
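
For orientation, here is a minimal sketch of the quantities the abstract refers to, under assumed standard notation that is not taken from the paper itself: p and q denote the data and generator densities, x_1, ..., x_n ~ p and y_1, ..., y_n ~ q are samples, and k_h is a kernel (noise density) with bandwidth h. The JSD and its kernel-smoothed empirical counterpart can then be written as

% Jensen-Shannon Divergence, with mixture M = (p + q)/2:
\mathrm{JSD}(p \,\|\, q) = \tfrac{1}{2}\,\mathrm{KL}\!\left(p \,\Big\|\, \tfrac{p+q}{2}\right) + \tfrac{1}{2}\,\mathrm{KL}\!\left(q \,\Big\|\, \tfrac{p+q}{2}\right)

% Kernel density estimates from the two samples (assumed notation):
\hat{p}_h(x) = \frac{1}{n}\sum_{i=1}^{n} k_h(x - x_i), \qquad
\hat{q}_h(x) = \frac{1}{n}\sum_{j=1}^{n} k_h(x - y_j)

% Plug-in, kernel-smoothed objective:
\mathrm{JSD}(\hat{p}_h \,\|\, \hat{q}_h)

The connection is that injecting independent noise with density k_h into the discriminator's inputs convolves each sample distribution with k_h, and the empirical version of that convolution is exactly a kernel density estimate; optimizing the resulting plug-in objective JSD(p̂_h || q̂_h) is what the abstract calls Kernel GANs.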