Proceedings of the 26th Annual International Conference on Machine Learning

A least squares formulation for a class of generalized eigenvalue problems in machine learning



Abstract

Many machine learning algorithms can be formulated as generalized eigenvalue problems. One major limitation of this formulation is that the generalized eigenvalue problem is computationally expensive to solve, especially for large-scale problems. In this paper, we show that under a mild condition, a class of generalized eigenvalue problems in machine learning can be formulated as a least squares problem. This class includes classical techniques such as Canonical Correlation Analysis (CCA), Partial Least Squares (PLS), and Linear Discriminant Analysis (LDA), as well as Hypergraph Spectral Learning (HSL). As a result, various regularization techniques can be readily incorporated into the formulation to improve model sparsity and generalization ability. In addition, the least squares formulation leads to efficient and scalable implementations based on iterative conjugate-gradient-type algorithms. We report experimental results that confirm the established equivalence relationship, and the results also demonstrate the efficiency and effectiveness of the equivalent least squares formulations on large-scale problems.
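The paper's equivalence covers CCA, PLS, LDA, and HSL in generality; as a small concrete illustration, the classical binary-class special case of this idea can be checked numerically. For two classes, the Fisher LDA direction (a generalized eigenvector of the between-class and within-class scatter matrices) is proportional to the ordinary least squares solution obtained by regressing the centered data onto class-indicator targets. The sketch below uses synthetic data and is not the paper's algorithm; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, d = 60, 40, 5
X1 = rng.normal(0.0, 1.0, (n1, d)) + 1.0  # class 1: unit-variance cloud, shifted mean
X2 = rng.normal(0.0, 1.0, (n2, d))        # class 2: centered cloud
X = np.vstack([X1, X2])
y = np.concatenate([np.ones(n1), -np.ones(n2)])  # any two distinct class targets work

# Fisher LDA direction: w ∝ Sw^{-1} (m1 - m2), with Sw the within-class scatter.
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w_lda = np.linalg.solve(Sw, m1 - m2)

# Least squares: regress centered data onto centered class-indicator targets.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
w_ls, *_ = np.linalg.lstsq(Xc, yc, rcond=None)

# The two directions coincide up to scaling: |cosine similarity| = 1.
cos = abs(w_lda @ w_ls) / (np.linalg.norm(w_lda) * np.linalg.norm(w_ls))
print(round(cos, 6))  # → 1.0
```

Because the least squares problem involves only a linear system, large-scale instances can be handled with iterative conjugate-gradient-type solvers instead of a dense generalized eigendecomposition, which is the scalability benefit the abstract refers to.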
