Journal: Statistics

Pruning a sufficient dimension reduction with a p-value guided hard-thresholding

Abstract

Principal fitted component (PFC) models are a class of likelihood-based inverse regression methods that yield a so-called sufficient reduction of the random p-vector of predictors X given the response Y. Assuming that a large number of the predictors carry no information about Y, we aimed to obtain an estimate of the sufficient reduction that 'purges' these irrelevant predictors and thus selects the most useful ones. We devised a procedure that uses observed significance values from the univariate fittings to yield a sparse PFC, a purged estimate of the sufficient reduction. The performance of the method is compared to that of penalized forward linear regression models for variable selection in high-dimensional settings.
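The screening idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' exact procedure (a full PFC fit regresses each predictor on a basis f(Y) and uses the resulting significance values); here we assume, for illustration only, a simplified version in which each predictor is regressed univariately on the response itself, and predictors whose slope p-values exceed a hard threshold are purged. The function name `pvalue_screen` and the threshold `alpha` are hypothetical choices, not from the paper.

```python
import numpy as np
from scipy import stats

def pvalue_screen(X, y, alpha=0.05):
    """Hard-threshold predictors by univariate significance.

    For each column X[:, j], fit a univariate inverse regression of the
    predictor on the response; keep predictor j only if the slope's
    p-value falls below `alpha`. Returns the kept indices and all p-values.
    """
    p = X.shape[1]
    pvals = np.empty(p)
    for j in range(p):
        # linregress(y, X[:, j]) regresses the predictor on the response,
        # matching the inverse-regression direction used by PFC models.
        pvals[j] = stats.linregress(y, X[:, j]).pvalue
    keep = np.flatnonzero(pvals < alpha)  # hard-thresholding step
    return keep, pvals

# Toy example: only the first two predictors carry information about y.
rng = np.random.default_rng(0)
n, p = 200, 10
y = rng.normal(size=n)
X = rng.normal(size=(n, p))
X[:, 0] += 2.0 * y
X[:, 1] -= 1.5 * y
keep, pvals = pvalue_screen(X, y, alpha=0.01)
print(keep)
```

In the sparse-PFC setting, the reduction would then be re-estimated using only the retained columns of X, so that the irrelevant predictors are purged from the final estimate.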
