Neurocomputing
A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine

Abstract

Recently, support vector machine (SVM) has become a popular tool in time series forecasting. In developing a successful SVM forecaster, the first step is feature extraction. This paper applies principal component analysis (PCA), kernel principal component analysis (KPCA) and independent component analysis (ICA) to feature extraction for SVM. PCA linearly transforms the original inputs into new uncorrelated features. KPCA is a nonlinear extension of PCA obtained by the kernel method. In ICA, the original inputs are linearly transformed into features that are mutually statistically independent. Experiments on the sunspot data, the Santa Fe data set A and five real futures contracts show that an SVM with feature extraction by PCA, KPCA or ICA outperforms an SVM without feature extraction. Among the three methods, KPCA feature extraction gives the best performance, followed by ICA.
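As a rough illustration of the pipeline the abstract describes (not the authors' original implementation), the sketch below builds lagged inputs from a synthetic series and compares an SVM regressor without feature extraction against PCA, KPCA and ICA front ends. The use of scikit-learn, the RBF kernel settings, the lag length and the number of extracted components are all illustrative assumptions.

```python
# Minimal sketch: PCA / KPCA / ICA feature extraction before an SVM regressor
# on lagged time-series inputs. Not the paper's original code; all settings
# below are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA, FastICA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Toy univariate series standing in for e.g. the sunspot data.
t = np.arange(1200)
series = np.sin(2 * np.pi * t / 11) + 0.3 * rng.standard_normal(t.size)

# Build lagged input vectors: predict series[k + lags] from the previous `lags` values.
lags = 12
X = np.column_stack([series[i:i - lags] for i in range(lags)])
y = series[lags:]
split = 1000
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

extractors = {
    "none": None,
    "PCA": PCA(n_components=5),
    "KPCA": KernelPCA(n_components=5, kernel="rbf", gamma=0.1),
    "ICA": FastICA(n_components=5, random_state=0),
}

for name, extractor in extractors.items():
    steps = [StandardScaler()]
    if extractor is not None:
        steps.append(extractor)          # optional feature-extraction stage
    steps.append(SVR(kernel="rbf", C=10.0, epsilon=0.01))
    model = make_pipeline(*steps)
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:>4s}: test MSE = {mse:.4f}")
```

In a comparison such as the one reported in the abstract, the number of components, the KPCA kernel width and the SVR hyper-parameters would be selected on the training data (for example by cross-validation) before measuring forecast error on the held-out portion.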