Workshop on Neural Generation and Translation

On the Importance of the Kullback-Leibler Divergence Term in Variational Autoencoders for Text Generation

Abstract

Variational Autoencoders (VAEs) are known to suffer from learning uninformative latent representations of the input, due to issues such as collapse of the approximate posterior or entanglement of the latent space. We impose an explicit constraint on the Kullback-Leibler (KL) divergence term inside the VAE objective function. While the explicit constraint naturally avoids posterior collapse, we use it to further understand the significance of the KL term in controlling the information transmitted through the VAE channel. Within this framework, we explore different properties of the estimated posterior distribution, and highlight the trade-off between the amount of information encoded in a latent code during training and the generative capacity of the model.
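
The abstract does not spell out the exact form of the constraint. One common way to impose an explicit constraint on the KL term is to penalize its deviation from a fixed target rate C rather than letting it be pushed toward zero. The following is a minimal PyTorch-style sketch under that assumption; the function name and the hyperparameters C and beta are illustrative, not the authors' released code.

    # Hypothetical sketch of a Gaussian-prior VAE loss where the KL term is
    # explicitly constrained to stay near a target value C (in nats), one way
    # to realize the kind of constraint described in the abstract.
    import torch

    def constrained_vae_loss(recon_logits, targets, mu, logvar, C=15.0, beta=1.0):
        # recon_logits: (batch, seq_len, vocab) decoder outputs
        # targets:      (batch, seq_len) token ids
        # mu, logvar:   (batch, latent_dim) approximate posterior parameters
        recon = torch.nn.functional.cross_entropy(
            recon_logits.reshape(-1, recon_logits.size(-1)),
            targets.reshape(-1),
            reduction="sum",
        ) / targets.size(0)

        # Analytic KL between N(mu, sigma^2) and the standard normal prior,
        # averaged over the batch.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / mu.size(0)

        # Penalize deviation from the target rate instead of driving KL to zero,
        # which keeps the latent code informative and avoids posterior collapse.
        return recon + beta * torch.abs(kl - C), recon, kl

Under this formulation the target C directly controls how much information is transmitted through the latent channel, which is the trade-off the abstract highlights between the information encoded in the latent code and the generative capacity of the model.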
