Venue: Annual Conference on Neural Information Processing Systems

Training sparse natural image models with a fast Gibbs sampler of an extended state space



Abstract

We present a new learning strategy based on an efficient blocked Gibbs sampler for sparse overcomplete linear models. Particular emphasis is placed on statistical image modeling, where overcomplete models have played an important role in discovering sparse representations. Our Gibbs sampler is faster than general-purpose sampling schemes while requiring no tuning, as it has no free parameters. Using the Gibbs sampler and a persistent variant of expectation maximization, we are able to extract highly sparse distributions over latent sources from data. When applied to natural images, our algorithm learns source distributions which resemble spike-and-slab distributions. We evaluate the likelihood and quantitatively compare the performance of the overcomplete linear model to its complete counterpart as well as a product-of-experts model, which represents another overcomplete generalization of the complete linear model. In contrast to previous claims, we find that overcomplete representations lead to significant improvements, but that the overcomplete linear model still underperforms other models.
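To make the setting concrete, the following is a minimal sketch of a componentwise Gibbs sampler for a spike-and-slab sparse linear model, x = A s + noise with s_i = z_i u_i, z_i ~ Bernoulli(pi), u_i ~ N(0, tau2). This is an illustrative stand-in for the kind of model discussed in the abstract, not the paper's extended-state-space blocked sampler; all function and parameter names are hypothetical.

```python
import numpy as np

def gibbs_spike_slab(x, A, n_iter=200, sigma2=0.1, tau2=1.0, pi=0.2, rng=None):
    """Componentwise Gibbs sampler for a spike-and-slab linear model.

    Model (illustrative, not the paper's method):
      x = A @ s + noise,  noise ~ N(0, sigma2 * I),
      s_i = z_i * u_i,    z_i ~ Bernoulli(pi),  u_i ~ N(0, tau2).
    Returns the posterior mean of s and the per-component inclusion
    frequencies, both averaged over the chain (no burn-in, for brevity).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    D, K = A.shape
    s = np.zeros(K)
    s_sum = np.zeros(K)
    z_sum = np.zeros(K)
    col_ss = np.sum(A * A, axis=0)      # ||a_i||^2 for each column
    r = x - A @ s                       # current residual
    for _ in range(n_iter):
        for i in range(K):
            r += A[:, i] * s[i]         # remove component i from the residual
            # Conditional posterior of s_i given z_i = 1 (the "slab"):
            v = 1.0 / (col_ss[i] / sigma2 + 1.0 / tau2)
            mu = v * (A[:, i] @ r) / sigma2
            # Log odds of z_i = 1 vs 0 after integrating out u_i:
            log_odds = (np.log(pi / (1.0 - pi))
                        + 0.5 * np.log(v / tau2)
                        + 0.5 * mu * mu / v)
            if rng.random() < 1.0 / (1.0 + np.exp(-log_odds)):
                s[i] = mu + np.sqrt(v) * rng.standard_normal()
            else:
                s[i] = 0.0
            r -= A[:, i] * s[i]         # add component i back
            z_sum[i] += s[i] != 0.0
        s_sum += s
    return s_sum / n_iter, z_sum / n_iter
```

This naive per-component scan is exactly the kind of general-purpose scheme the paper's blocked sampler is designed to outperform: blocking the latent variables (via an extended state space) lets correlated sources be updated jointly rather than one coordinate at a time.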
