...
Home > Foreign Journals > Machine Learning > Geometry-aware principal component analysis for symmetric positive definite matrices

Geometry-aware principal component analysis for symmetric positive definite matrices



Abstract

Symmetric positive definite (SPD) matrices, in the form of covariance matrices for example, are ubiquitous in machine learning applications. However, because their size grows quadratically with the number of variables, high dimensionality can pose a difficulty when working with them. So, it may be advantageous to apply dimensionality reduction techniques to them. Principal component analysis (PCA) is a canonical tool for dimensionality reduction, which for vector data maximizes the preserved variance. Yet, the commonly used, naive extensions of PCA to matrices result in sub-optimal variance retention. Moreover, when applied to SPD matrices, they ignore the geometric structure of the space of SPD matrices, further degrading the performance. In this paper we develop a new Riemannian-geometry-based formulation of PCA for SPD matrices that (1) preserves more data variance by appropriately extending PCA to matrix data, and (2) extends the standard definition from the Euclidean to the Riemannian geometries. We experimentally demonstrate the usefulness of our approach as pre-processing for EEG signals and for texture image classification.
