
Stein Points


Abstract

An important task in computational statistics and machine learning is to approximate a posterior distribution $p(x)$ with an empirical measure supported on a set of representative points $\{x_i\}_{i=1}^n$. This paper focuses on methods where the selection of points is essentially deterministic, with an emphasis on achieving accurate approximation when $n$ is small. To this end, we present Stein Points. The idea is to exploit either a greedy or a conditional gradient method to iteratively minimise a kernel Stein discrepancy between the empirical measure and $p(x)$. Our empirical results demonstrate that Stein Points enable accurate approximation of the posterior at modest computational cost. In addition, theoretical results are provided to establish convergence of the method.
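The greedy variant described in the abstract can be sketched in one dimension: each new point is chosen to minimise the kernel Stein discrepancy of the growing point set, which for a greedy step reduces to minimising $\frac{1}{2}k_0(x,x) + \sum_{j<n} k_0(x, x_j)$ over candidates $x$, where $k_0$ is the Stein kernel built from a base kernel and the score $\nabla_x \log p(x)$. The sketch below is illustrative only, not the paper's implementation: it assumes a standard-normal target (so the score is $-x$), an inverse multiquadric base kernel, and a simple grid search in place of a proper optimiser.

```python
import numpy as np

def score(x):
    # Score of the (assumed) standard-normal target: d/dx log p(x) = -x
    return -x

def stein_kernel(x, y):
    # Langevin Stein kernel k_0 derived from the IMQ base kernel
    # k(x, y) = (1 + (x - y)^2)^(-1/2), in one dimension.
    u = x - y
    b = 1.0 + u**2
    k = b**-0.5
    dkx = -u * b**-1.5                    # dk/dx
    dky = u * b**-1.5                     # dk/dy
    dkxy = b**-1.5 - 3.0 * u**2 * b**-2.5  # d^2 k / dx dy
    return dkxy + score(x) * dky + score(y) * dkx + score(x) * score(y) * k

def greedy_stein_points(n, grid):
    # Greedily add the candidate that most reduces the kernel Stein
    # discrepancy of the current point set.
    pts = []
    for _ in range(n):
        obj = 0.5 * stein_kernel(grid, grid)
        for xj in pts:
            obj = obj + stein_kernel(grid, xj)
        pts.append(grid[np.argmin(obj)])
    return np.array(pts)

grid = np.linspace(-4.0, 4.0, 801)
pts = greedy_stein_points(20, grid)
print(np.mean(pts), np.std(pts))
```

Because the target here is symmetric about zero, the greedy points spread symmetrically around the mode, and their sample mean and standard deviation should approach those of the target as $n$ grows.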
