Asilomar Conference on Signals, Systems, and Computers

Approximate Log-Determinant Divergences Between Covariance Operators and Applications



Abstract

Covariance matrices and covariance operators have been playing increasingly important roles in numerous applications in machine learning, computer vision, image and signal processing. An active current research direction on covariance matrices and operators involves the exploitation of their intrinsic non-Euclidean geometrical structures for optimal practical performance. In this work, we consider the Log-Determinant divergences, a family of parametrized divergences encompassing many different divergences and distances between covariance matrices and operators, including the affine-invariant Riemannian distance and the symmetric Stein divergence. In particular, we present finite-dimensional approximations of the infinite-dimensional Log-Determinant divergences between covariance operators, which consistently estimate the exact versions and at the same time can be substantially more efficient to compute. Computationally, we focus on covariance operators in reproducing kernel Hilbert spaces. For the Hellinger distance, defined using the symmetric Stein divergence, we obtain a two-layer kernel machine defined using both the mean vector and the covariance operator. The theoretical formulation is accompanied by numerical experiments in computer vision.
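The abstract names the symmetric Stein divergence and the affine-invariant Riemannian distance as members of the Log-Determinant family. The sketch below is a minimal finite-dimensional illustration of these quantities between two sample covariance matrices, not the paper's RKHS approximation scheme; it assumes NumPy/SciPy, the function names are hypothetical, and the one-parameter Alpha Log-Determinant form used here (the Chebbi-Moakher parametrization, which reduces to four times the Stein divergence at alpha = 0) may differ from the parametrization adopted in the paper.

```python
import numpy as np
from scipy.linalg import eigvalsh


def logdet(A):
    """Log-determinant of an SPD matrix via Cholesky (numerically stable)."""
    L = np.linalg.cholesky(A)
    return 2.0 * np.sum(np.log(np.diag(L)))


def stein_divergence(A, B):
    """Symmetric Stein (S-) divergence between SPD matrices A and B."""
    return logdet(0.5 * (A + B)) - 0.5 * (logdet(A) + logdet(B))


def alpha_logdet_divergence(A, B, alpha=0.0):
    """One-parameter Alpha Log-Det divergence (Chebbi-Moakher form), alpha in (-1, 1).

    At alpha = 0 this equals 4 times the symmetric Stein divergence.
    """
    a, b = (1.0 - alpha) / 2.0, (1.0 + alpha) / 2.0
    val = logdet(a * A + b * B) - a * logdet(A) - b * logdet(B)
    return 4.0 / (1.0 - alpha ** 2) * val


def affine_invariant_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B."""
    # Generalized eigenvalues of (B, A) are the eigenvalues of A^{-1} B.
    lam = eigvalsh(B, A)
    return np.sqrt(np.sum(np.log(lam) ** 2))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    Y = 1.5 * rng.standard_normal((100, 5))
    # Small ridge keeps the sample covariances strictly positive definite.
    A = np.cov(X, rowvar=False) + 1e-6 * np.eye(5)
    B = np.cov(Y, rowvar=False) + 1e-6 * np.eye(5)
    print("Stein divergence:         ", stein_divergence(A, B))
    print("Alpha Log-Det (alpha=0):  ", alpha_logdet_divergence(A, B, 0.0))
    print("Affine-invariant distance:", affine_invariant_distance(A, B))
```

In the infinite-dimensional setting treated in the paper, covariance operators on an RKHS are compact and the log-determinant terms are not directly defined, which is why regularization and the finite-dimensional approximations described in the abstract are needed; the code above only illustrates the matrix case.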
