...
Journal: International Journal of Information Technology & Decision Making

UNSUPERVISED FEATURE SELECTION USING INCREMENTAL LEAST SQUARES

Abstract

An unsupervised feature selection method is proposed for the analysis of high-dimensional datasets. The least squares error (LSE) of approximating the complete dataset from a reduced feature subset is proposed as the quality measure for feature selection. Guided by the minimization of the LSE, a kernel least squares forward selection algorithm (KLS-FS) is developed that is capable of both linear and non-linear feature selection. An incremental LSE computation is designed to accelerate the selection process, thereby enhancing the scalability of KLS-FS to high-dimensional datasets. The superiority of the proposed feature selection algorithm, in terms of preserving principal data structures, learning performance in classification and clustering applications, and robustness, is demonstrated on various real-life datasets of different sizes and dimensionalities.
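To illustrate the selection criterion described in the abstract, the following is a minimal sketch of linear LSE-guided forward feature selection: at each step, the feature whose addition most reduces the least-squares error of reconstructing the full data matrix from the selected columns is chosen. The function names (`lse`, `forward_select`) are illustrative only; the kernelized variant (KLS-FS) and the incremental LSE update that give the published algorithm its scalability are not reproduced here.

```python
# Sketch of linear, non-incremental LSE-based forward feature selection.
# This naively re-fits the least-squares approximation for every candidate
# feature; the paper's incremental computation avoids this cost.
import numpy as np

def lse(X, selected):
    """Least-squares error of approximating X from the columns in `selected`."""
    Xs = X[:, selected]                          # n x |S| sub-matrix
    W, *_ = np.linalg.lstsq(Xs, X, rcond=None)   # solve min ||X - Xs W||_F^2
    return np.linalg.norm(X - Xs @ W, 'fro') ** 2

def forward_select(X, k):
    """Greedily pick k feature indices that minimize the reconstruction LSE."""
    d = X.shape[1]
    selected, remaining = [], list(range(d))
    for _ in range(k):
        errors = [lse(X, selected + [j]) for j in remaining]
        best = remaining[int(np.argmin(errors))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Example usage on random data (hypothetical; not from the paper).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    print(forward_select(X, 5))
```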
