IEEE Transactions on Neural Networks and Learning Systems

Scalable Nonparametric Low-Rank Kernel Learning Using Block Coordinate Descent



Abstract

Nonparametric kernel learning (NPKL) is a flexible approach to learn the kernel matrix directly without assuming any parametric form. It can be naturally formulated as a semidefinite program (SDP), which, however, is not very scalable. To address this problem, we propose the combined use of low-rank approximation and block coordinate descent (BCD). Low-rank approximation avoids the expensive positive semidefinite constraint in the SDP by replacing the kernel matrix variable K with VV^T, where V is a low-rank matrix. The resultant nonlinear optimization problem is then solved by BCD, which optimizes each column of V sequentially. It can be shown that the proposed algorithm has nice convergence properties and low computational complexities. Experiments on a number of real-world data sets show that the proposed algorithm outperforms state-of-the-art NPKL solvers.
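To make the column-wise BCD idea concrete, below is a minimal Python sketch. It assumes a simple squared-loss NPKL objective over pairwise must-link/cannot-link constraints and uses a plain gradient step per column of V; the paper's actual objective and per-column update rule may differ, and all names here (npkl_bcd, T, pairs, lam) are illustrative, not from the paper.

```python
import numpy as np

def npkl_bcd(T, pairs, n, rank=10, n_epochs=50, lr=0.1, lam=0.01, seed=0):
    """Sketch of low-rank NPKL via block coordinate descent.

    Learns K = V @ V.T from pairwise similarity constraints.
    T     : dict mapping (i, j) -> target similarity (+1 must-link, -1 cannot-link)
    pairs : list of constrained index pairs (i, j)
    n     : number of data points

    Assumed objective (not the paper's exact formulation):
        0.5 * sum_{(i,j)} (V[i] @ V[j] - T[i,j])^2 + 0.5 * lam * ||V||_F^2
    """
    rng = np.random.default_rng(seed)
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(n_epochs):
        # BCD: sweep over the columns of V, updating one block at a time
        for c in range(rank):
            grad = lam * V[:, c].copy()
            for (i, j) in pairs:
                # residual between current kernel entry and its target
                r = V[i] @ V[j] - T[(i, j)]
                grad[i] += r * V[j, c]
                grad[j] += r * V[i, c]
            V[:, c] -= lr * grad  # illustrative gradient step on column c
    return V @ V.T  # learned kernel matrix, PSD by construction, rank <= `rank`

if __name__ == "__main__":
    # toy usage: 6 points, one must-link and one cannot-link constraint
    pairs = [(0, 1), (0, 5)]
    T = {(0, 1): 1.0, (0, 5): -1.0}
    K = npkl_bcd(T, pairs, n=6, rank=3)
    print(np.round(K[:2, :2], 3))
```

Two points of the abstract's reasoning show up directly in the sketch: because K is formed as VV^T, it is positive semidefinite by construction, so the expensive SDP constraint never appears; and because each BCD step touches only one column of V, each subproblem stays low-dimensional and cheap.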


