Semi-supervised Laplacian Regularization of Kernel Canonical Correlation Analysis

European Conference on Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2008)

Abstract

Kernel canonical correlation analysis (KCCA) is a dimensionality reduction technique for paired data. By finding directions that maximize correlation, KCCA learns representations that are tied more closely to the underlying semantics of the data than to noise. However, meaningful directions are not only those that have high correlation with another modality, but also those that capture the manifold structure of the data. We propose a method that simultaneously finds highly correlated directions that also lie along high-variance directions of the data manifold. This is achieved by semi-supervised Laplacian regularization of KCCA. We show experimentally that Laplacian-regularized training improves class separation over KCCA with only Tikhonov regularization, while causing no degradation in the correlation between modalities. We propose a model selection criterion based on the Hilbert-Schmidt norm of the semi-supervised Laplacian-regularized cross-covariance operator, which we compute in closed form.
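The abstract describes the method only at a high level. Below is a minimal, hypothetical sketch of how a Laplacian-regularized KCCA objective of this general form can be set up as a generalized eigenvalue problem; it is not the authors' implementation. The RBF kernels, the unnormalized k-NN graph Laplacian, the hyperparameter names eps (Tikhonov) and gamma (Laplacian weight), and the use of scipy's eigh solver are all assumptions made for illustration.

```python
# Sketch of Laplacian-regularized KCCA (illustrative, not the authors' code).
# Assumptions: RBF kernels, a k-NN graph Laplacian built from the same
# samples, and hand-chosen regularization weights eps and gamma.
import numpy as np
from scipy.linalg import eigh
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.neighbors import kneighbors_graph


def graph_laplacian(X, n_neighbors=10):
    """Unnormalized graph Laplacian L = D - W from a symmetrized k-NN graph."""
    W = kneighbors_graph(X, n_neighbors=n_neighbors, mode="connectivity")
    W = 0.5 * (W + W.T).toarray()          # symmetrize the adjacency matrix
    return np.diag(W.sum(axis=1)) - W


def laplacian_kcca(X, Y, eps=1e-3, gamma=1e-2, n_components=2):
    """KCCA as a generalized eigenproblem with an extra K L K regularizer."""
    n = X.shape[0]
    Kx, Ky = rbf_kernel(X), rbf_kernel(Y)
    Lx, Ly = graph_laplacian(X), graph_laplacian(Y)

    # Off-diagonal blocks: cross-correlation between the two modalities.
    A = np.zeros((2 * n, 2 * n))
    A[:n, n:] = Kx @ Ky
    A[n:, :n] = Ky @ Kx

    # Diagonal blocks: variance with Tikhonov (eps) and Laplacian (gamma) terms.
    Rx = Kx @ Kx + eps * Kx + gamma * Kx @ Lx @ Kx
    Ry = Ky @ Ky + eps * Ky + gamma * Ky @ Ly @ Ky
    B = np.zeros((2 * n, 2 * n))
    B[:n, :n], B[n:, n:] = Rx, Ry
    B += 1e-6 * np.eye(2 * n)              # jitter to keep B positive definite

    # Largest generalized eigenvalues give the most correlated direction pairs.
    vals, vecs = eigh(A, B)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:n, order], vecs[n:, order], vals[order]
```

In this sketch the Laplacians are built from the paired samples only; the semi-supervised aspect described in the abstract, in which the Laplacian regularizer can also draw on additional unlabeled data, is omitted for brevity.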
