...
IEEE Transactions on Neural Networks

Clustered Nyström Method for Large Scale Manifold Learning and Dimension Reduction



Abstract

The kernel (or similarity) matrix plays a key role in many machine learning algorithms, such as kernel methods, manifold learning, and dimension reduction. However, the cost of storing and manipulating the complete kernel matrix makes it infeasible for large problems. The Nyström method is a popular sampling-based low-rank approximation scheme for reducing the computational burden of handling large kernel matrices. In this paper, we analyze how the approximation quality of the Nyström method depends on the choice of landmark points, and in particular on how well the landmark points summarize, or encode, the data. Our (non-probabilistic) error analysis justifies a "clustered Nyström method" that uses the ${k}$-means cluster centers as landmark points. Our algorithm can be applied to scale up a wide variety of algorithms that depend on the eigenvalue decomposition of the kernel matrix (or a variant of it), such as kernel principal component analysis, Laplacian eigenmaps, and spectral clustering, as well as those involving the kernel matrix inverse, such as least-squares support vector machines and Gaussian process regression. Extensive experiments demonstrate the competitive performance of our algorithm in both accuracy and efficiency.
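The core idea can be sketched in a few lines: pick ${k}$-means centers as landmarks, form the cross-kernel between all points and the landmarks, and combine it with the pseudo-inverse of the landmark kernel to get a rank-${k}$ approximation. The sketch below is a minimal, illustrative NumPy implementation, not the authors' code; the RBF kernel, the plain Lloyd-iteration k-means, and all function names are assumptions for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.05):
    # Gaussian (RBF) kernel via the squared-distance expansion ||a-b||^2.
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kmeans_centers(X, k, iters=20, seed=0):
    # Plain Lloyd iterations; returns the k cluster centers (the landmarks).
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(0)
    return centers

def clustered_nystrom(X, k, gamma=0.05):
    # Rank-k Nystrom approximation K ~= C W^+ C^T with k-means landmarks.
    Z = kmeans_centers(X, k)          # landmarks = k-means centers
    C = rbf_kernel(X, Z, gamma)       # n x k cross-kernel
    W = rbf_kernel(Z, Z, gamma)       # k x k landmark kernel
    return C @ np.linalg.pinv(W) @ C.T
```

The approximation never forms the full n x n kernel matrix during landmark selection; only the n x k and k x k blocks are needed, which is what makes the scheme attractive when n is large.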


