Computational Statistics & Data Analysis

Dimension reduction using the generalized gradient direction
           

Abstract

Sufficient dimension reduction methods, such as sliced inverse regression (SIR) and the sliced average variance estimate (SAVE), usually place restrictions on the regressor X, requiring it to be elliptical or normal. We propose a new, effective method, called the generalized gradient direction method (GGD), for solving sufficient dimension reduction problems. Compared with SIR, SAVE, and related methods, GGD makes very weak assumptions on X and performs well when X is a continuous variable or a numerical discrete variable, whereas existing methods are all developed for continuous X. The computation for GGD is as simple as that for SIR and SAVE. Moreover, GGD proves robust compared with many standard techniques. Simulation results, in comparison with results from other methods, support the advantages of GGD.
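The abstract does not spell out the GGD algorithm itself, but the baseline it compares against, SIR, is standard and illustrates what "sufficient dimension reduction" computes: directions B such that y depends on X only through B'X. A minimal sketch of SIR (the `sir_directions` helper name and the slicing scheme are illustrative choices, not from this paper) might look like:

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Sliced inverse regression: estimate effective dimension
    reduction directions. Note the ellipticity assumption on X
    that this baseline needs -- the restriction GGD relaxes."""
    n, p = X.shape
    # Standardize: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)  # symmetric inverse square root
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice observations by the sorted response y
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)

    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    top = v[:, np.argsort(w)[::-1][:n_dirs]]
    B = Sigma_inv_sqrt @ top
    return B / np.linalg.norm(B, axis=0)  # unit-norm direction columns
```

For example, with X drawn as standard normal and y = X[:, 0] + noise, the single estimated direction should align closely with the first coordinate axis. When X is a numerical discrete variable, or far from elliptical, this estimate can degrade, which is the setting the paper's GGD method targets.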
