International Conference on Neural Information Processing (ICONIP 2011)

On Low-Rank Regularized Least Squares for Scalable Nonlinear Classification


Abstract

In this paper, we revisit the classical technique of Regularized Least Squares (RLS) for the classification of large-scale nonlinear data. Specifically, we focus on a low-rank formulation of RLS and show that, for problems of moderate feature dimension, its time complexity is linear in the data size alone and independent of the number of labels and features. This makes low-rank RLS particularly suitable for classification on large data sets. Moreover, we propose a general theorem giving closed-form solutions to the Leave-One-Out Cross Validation (LOOCV) estimation problem in empirical risk minimization, which encompasses all types of RLS classifiers as special cases. This eliminates the reliance on cross validation, a computationally expensive process for parameter selection, and greatly accelerates the training of RLS classifiers. Experimental results on real and synthetic large-scale benchmark data sets show that low-rank RLS achieves classification performance comparable to standard kernel SVM for nonlinear classification while being far more efficient. The efficiency gain is more pronounced for higher-dimensional data sets.
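To make the two ideas in the abstract concrete, the following is a minimal sketch (not the paper's exact formulation) of a Nyström-style low-rank RLS classifier: the kernel is evaluated only against m landmark points, so fitting reduces to an m-by-m ridge system, and the LOOCV error is obtained in closed form from the hat-matrix diagonal via the standard linear-smoother identity, analogous in spirit to the paper's theorem. All function names, the RBF kernel choice, and the uniform landmark sampling are assumptions for illustration.

```python
import numpy as np

def rbf_block(X, Z, gamma):
    """RBF kernel block between rows of X and landmark rows of Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lowrank_rls_fit(X, y, m=50, lam=1e-2, gamma=1.0, seed=0):
    """Low-rank RLS sketch: sample m landmarks, solve an m x m ridge system.

    Returns the landmarks, the coefficient vector, and a closed-form
    LOOCV mean squared error (no refitting needed).
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    Z = X[idx]
    K = rbf_block(X, Z, gamma)                     # n x m kernel block
    A = K.T @ K + lam * np.eye(K.shape[1])         # m x m regularized system
    a = np.linalg.solve(A, K.T @ y)                # coefficients
    # Closed-form LOOCV residuals via the hat-matrix diagonal:
    #   r_i = (y_i - yhat_i) / (1 - H_ii),  H = K A^{-1} K^T
    H_diag = np.einsum('ij,ji->i', K, np.linalg.solve(A, K.T))
    loo = (y - K @ a) / (1.0 - H_diag)
    return Z, a, float((loo ** 2).mean())

def lowrank_rls_predict(Xnew, Z, a, gamma=1.0):
    """Predict +/-1 labels for new points against the stored landmarks."""
    return np.sign(rbf_block(Xnew, Z, gamma) @ a)
```

The cost per fit is O(n * m^2) rather than the O(n^3) of a full kernel solve, which is the source of the linear-in-n scaling the abstract claims; the closed-form LOOCV score lets one compare values of `lam` and `gamma` without retraining n times.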
