IEEE Transactions on Neural Networks and Learning Systems

Supervised Dimensionality Reduction Methods via Recursive Regression



Abstract

In this article, the recursive problems of both orthogonal linear discriminant analysis (OLDA) and orthogonal least squares regression (OLSR) are investigated. Unlike previous works, the associated recursive problems are addressed via a novel recursive regression method, which heuristically achieves dimensionality reduction in the orthogonal complement space. For the OLDA, an efficient method is developed to obtain the associated optimal subspace, which is closely related to the orthonormal basis of the optimal solution to ridge regression. For the OLSR, a scalable subspace is introduced to build an original OLSR with optimal scaling (OS). By further relaxing the proposed problem into a convex parameterized orthogonal quadratic problem, an effective approach is derived such that not only can the optimal subspace be achieved but the OS can also be obtained automatically. Accordingly, two supervised dimensionality reduction methods are proposed by obtaining heuristic solutions to the recursive problems of the OLDA and the OLSR.
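The abstract notes that the optimal OLDA subspace is closely related to the orthonormal basis of the ridge-regression solution. The following minimal sketch illustrates that connection only; it is not the authors' algorithm, and the one-hot label encoding, variable names, and penalty value are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, c = 100, 10, 3          # samples, features, classes
X = rng.standard_normal((n, d))
y = rng.integers(0, c, n)
Y = np.eye(c)[y]              # one-hot label matrix, shape (n, c)

lam = 1e-2                    # ridge penalty (illustrative value)
# Ridge-regression solution: W = (X^T X + lam*I)^{-1} X^T Y, shape (d, c)
W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Orthonormal basis of the ridge solution's column space via reduced QR;
# its span is the candidate low-dimensional subspace
Q, _ = np.linalg.qr(W)        # Q has orthonormal columns, shape (d, c)
Z = X @ Q                     # data projected onto the subspace, shape (n, c)
```

A full recursive scheme would repeat such a step in the orthogonal complement of the subspace found so far; here only the single ridge-plus-orthonormalization step is shown.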


