18th Mediterranean Conference on Control and Automation
Dimensionality reduction of RKHS model using Reduced Kernel Principal Component Analysis (RKPCA)



Abstract

This paper addresses the complexity reduction of models built in a Reproducing Kernel Hilbert Space (RKHS) using statistical learning theory (SLT) for supervised learning problems. The resulting RKHS model suffers from having as many parameters as there are observations used in the learning phase. In this paper we propose a new way to reduce the number of parameters of the RKHS model. The proposed method, called Reduced Kernel Principal Component Analysis (RKPCA), approximates the principal components retained by the KPCA method with a set of observation vectors that point in the directions of the largest variances along those retained principal components. The proposed method was tested on a chemical reactor, with successful results.
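The abstract's selection idea can be illustrated with a short sketch: run KPCA on the training data, then, for each retained principal direction, keep the training observation whose feature-space image is most aligned with it. The RBF kernel, its width, and the alignment criterion (largest absolute projection) are assumptions for illustration, since the abstract does not give the paper's exact procedure.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def rkpca_select(X, n_components=2, gamma=1.0):
    """Sketch of the RKPCA selection step (assumed criterion):
    run KPCA, then pick, for each retained principal direction,
    the training observation most aligned with it in feature space."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # center the kernel matrix in feature space
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigendecomposition; retain the top n_components directions
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    alphas = V[:, idx] / np.sqrt(np.maximum(w[idx], 1e-12))  # normalized dual coefficients
    # <phi(x_j), v_i> = sum_k alpha_ik Kc[k, j]: projection of each
    # observation onto each retained principal direction
    proj = alphas.T @ Kc                      # shape (n_components, n)
    # keep the observation with the largest |projection| per direction
    return np.unique(np.argmax(np.abs(proj), axis=1))
```

The selected indices identify a small subset of observations whose kernel functions can stand in for the full set, which is the source of the parameter reduction the abstract describes.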

