Nonparametric Bayesian models for supervised dimension reduction and regression.

Abstract

We propose nonparametric Bayesian models for supervised dimension reduction and regression problems. Supervised dimension reduction is a setting in which one needs to reduce the dimensionality of the predictors, or find the dimension reduction subspace, while losing little or no predictive information. Our first method retrieves the dimension reduction subspace in the inverse regression framework by utilizing a dependent Dirichlet process that allows for natural clustering of the data in terms of both the response and predictor variables. Our second method is based on ideas from the gradient learning framework and retrieves the dimension reduction subspace through coherent nonparametric Bayesian kernel models. We also discuss and provide a new rationalization of kernel regression based on nonparametric Bayesian models, allowing for direct and formal inference on the uncertain regression functions. Our proposed models apply to high-dimensional cases where the number of variables far exceeds the sample size, and hold both for the classical setting of Euclidean subspaces and for the Riemannian setting where the marginal distribution is concentrated on a manifold. Our Bayesian perspective adds appropriate probabilistic and statistical frameworks that allow for rich inference, such as uncertainty estimation, which is important for assessing the quality of the estimates. Formal probabilistic models with likelihoods and priors are given, and efficient posterior sampling can be obtained by Markov chain Monte Carlo methods, particularly Gibbs sampling schemes. For supervised dimension reduction, since the posterior draws are linear subspaces, which are points on a Grassmann manifold, we carry out posterior inference with respect to geodesics on the Grassmannian. The utility of our approaches is illustrated on simulated and real examples.
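
The "dimension reduction subspace" the abstract refers to can be made precise with the standard sufficient dimension reduction formulation shown below. The notation (a p × d matrix B with d < p) is our own shorthand, not quoted from the dissertation, and is included only as a hedged illustration of the target of both proposed methods.

```latex
% Standard sufficient dimension reduction (SDR) condition, under the
% assumption that B is a p x d matrix with d < p: the subspace spanned by
% the columns of B loses no predictive information when the response Y is
% conditionally independent of the predictors X given the projections B^T X.
\[
  Y \;\perp\!\!\!\perp\; X \;\mid\; B^{\top}X ,
  \qquad
  \mathcal{S}_{Y\mid X} \;=\; \operatorname{span}(B)
  \quad \text{(the central, or dimension reduction, subspace).}
\]
```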
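As a rough, non-Bayesian illustration of the inverse regression idea the abstract builds on, the sketch below implements classical sliced inverse regression (SIR) in NumPy. This is not the dependent Dirichlet process method proposed in the dissertation; the function and parameter names (`estimate_sir_directions`, `n_slices`) are ours, and the sketch only shows how slicing the response and eigendecomposing the between-slice covariance recovers a basis for the dimension reduction subspace.

```python
import numpy as np

def estimate_sir_directions(X, y, d, n_slices=10):
    """Classical sliced inverse regression (SIR), a non-Bayesian baseline
    for the inverse regression framework: slice the response, average the
    standardized predictors within each slice, and take the top eigenvectors
    of the between-slice covariance as an estimated basis for the
    dimension reduction subspace."""
    n, p = X.shape

    # Standardize the predictors: Z = (X - mean) @ Sigma^{-1/2}.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    Z = Xc @ inv_sqrt

    # Slice the sample by the ordered response values.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of the within-slice means of Z.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Top-d eigenvectors of M, mapped back to the original X scale.
    evals_M, evecs_M = np.linalg.eigh(M)
    top = evecs_M[:, np.argsort(evals_M)[::-1][:d]]
    B = inv_sqrt @ top                       # columns span the estimated subspace
    return B / np.linalg.norm(B, axis=0)     # normalize each direction

# Toy usage: y depends on X only through a single linear direction.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
beta = np.array([1.0, -1.0, 0, 0, 0, 0, 0, 0])
y = np.sin(X @ beta) + 0.1 * rng.normal(size=500)
print(estimate_sir_directions(X, y, d=1).ravel())
```

The Bayesian methods summarized in the abstract replace this moment-based estimate with a full posterior over such subspaces, which is what makes uncertainty quantification on the Grassmannian possible.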

Record details

  • Author: Mao, Kai.
  • Affiliation: Duke University.
  • Degree-granting institution: Duke University.
  • Subject: Statistics.
  • Degree: Ph.D.
  • Year: 2009
  • Pages: 118 p.
  • Total pages: 118
  • Format: PDF
  • Language: eng
  • Date added: 2022-08-17 11:38:09
