Conference on Neural Information Processing Systems (Annual Conference on Neural Information Processing Systems)

Fast Graph Laplacian Regularized Kernel Learning via Semidefinite-Quadratic-Linear Programming



Abstract

Kernel learning is a powerful framework for nonlinear data modeling. Using the kernel trick, a number of problems have been formulated as semidefinite programs (SDPs). These include Maximum Variance Unfolding (MVU) (Weinberger et al., 2004) in nonlinear dimensionality reduction, and Pairwise Constraint Propagation (PCP) (Li et al., 2008) in constrained clustering. Although in theory SDPs can be efficiently solved, the high computational complexity incurred in numerically processing the huge linear matrix inequality constraints has rendered the SDP approach unscalable. In this paper, we show that a large class of kernel learning problems can be reformulated as semidefinite-quadratic-linear programs (SQLPs), which only contain a simple positive semidefinite constraint, a second-order cone constraint and a number of linear constraints. These constraints are much easier to process numerically, and the gain in speedup over previous approaches is at least of the order m^(2.5), where m is the matrix dimension. Experimental results are also presented to show the superb computational efficiency of our approach.
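As a rough illustration of why such a reformulation shrinks the semidefinite block (a sketch under assumptions, not the authors' code): in graph Laplacian regularized kernel learning, the kernel matrix can be parameterized as K = U M Uᵀ, where U collects the m smoothest eigenvectors of the graph Laplacian. Then K ⪰ 0 reduces to the small m×m constraint M ⪰ 0, which is far cheaper than a full n×n linear matrix inequality. All data below is synthetic.

```python
import numpy as np

# Sketch (illustrative assumptions, not the paper's implementation):
# parameterize the kernel as K = U M U^T, where U holds the m smoothest
# eigenvectors of the graph Laplacian L. The full n x n PSD constraint on K
# then collapses to the small m x m constraint M >= 0.
rng = np.random.default_rng(0)
n, m = 50, 5

# Random symmetric adjacency matrix and its (unnormalized) graph Laplacian.
A = rng.random((n, n))
A = (A + A.T) / 2
np.fill_diagonal(A, 0.0)
L = np.diag(A.sum(axis=1)) - A

# Bottom-m eigenvectors of L: the smoothest functions on the graph.
eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
U = eigvecs[:, :m]                     # n x m basis

# Any PSD M yields a PSD kernel K = U M U^T of rank at most m.
B = rng.random((m, m))
M = B @ B.T                            # m x m, PSD by construction
K = U @ M @ U.T                        # n x n kernel, PSD automatically

# Check: K is symmetric PSD although K >= 0 was never imposed directly.
print(np.linalg.eigvalsh(K).min() >= -1e-8)
```

The point of the reformulation is that the optimization variable becomes the small block M (plus a second-order cone and linear constraints), rather than the full n×n kernel, which is where the claimed m^(2.5)-order speedup comes from.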


