IEICE Transactions on Information and Systems

Superfast-Trainable Multi-Class Probabilistic Classifier by Least-Squares Posterior Fitting



Abstract

Kernel logistic regression (KLR) is a powerful and flexible classification algorithm that can provide the confidence of its class predictions. However, its training, typically carried out by (quasi-)Newton methods, is rather time-consuming. In this paper, we propose an alternative probabilistic classification algorithm called the Least-Squares Probabilistic Classifier (LSPC). KLR models the class-posterior probability by a log-linear combination of kernel functions, and its parameters are learned by (regularized) maximum likelihood. In contrast, LSPC employs a linear combination of kernel functions, and its parameters are learned by regularized least-squares fitting of the true class-posterior probability. Thanks to this linear regularized least-squares formulation, the solution of LSPC can be computed analytically, simply by solving a regularized system of linear equations in a class-wise manner. LSPC is therefore computationally very efficient and numerically stable. Through experiments, we show that LSPC is faster to train than KLR by two orders of magnitude, with comparable classification accuracy.
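
The training procedure described in the abstract reduces, per class, to one regularized linear system with an analytic solution. Below is a minimal NumPy sketch of that idea, assuming a Gaussian kernel centered at all training points and a shared kernel matrix across classes; the function names and the hyperparameters sigma (kernel bandwidth) and lam (regularization strength) are illustrative choices, not the authors' reference implementation.

    import numpy as np

    def gaussian_kernel(X, C, sigma):
        # Gaussian kernel matrix between rows of X and centers C
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def lspc_train(X, y, n_classes, sigma=1.0, lam=0.1):
        # Model the class-posterior as a linear combination of kernels
        # centered at the training points; fit the coefficients of each
        # class by regularized least squares (one linear system per class).
        Phi = gaussian_kernel(X, X, sigma)            # n x n design matrix
        n = X.shape[0]
        H = Phi.T @ Phi / n + lam * np.eye(n)         # shared across classes
        Theta = np.zeros((n_classes, n))
        for c in range(n_classes):
            h_c = Phi[y == c].sum(axis=0) / n         # empirical cross term for class c
            Theta[c] = np.linalg.solve(H, h_c)        # analytic class-wise solution
        return Theta

    def lspc_predict_proba(X_test, X_train, Theta, sigma=1.0):
        # Posterior estimates: clip negative outputs to zero and renormalize.
        Phi = gaussian_kernel(X_test, X_train, sigma)
        q = np.maximum(Phi @ Theta.T, 0.0)            # n_test x n_classes
        s = q.sum(axis=1, keepdims=True)
        s[s == 0.0] = 1.0                             # guard against all-zero rows
        return q / s

    # Toy usage: two linearly separated classes in the plane
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    Theta = lspc_train(X, y, n_classes=2)
    proba = lspc_predict_proba(X[:5], X, Theta)

The paper also discusses a class-wise choice of kernel basis in which the basis functions for class c are centered only at the class-c training samples, which makes each linear system considerably smaller; the sketch above uses the full kernel basis for brevity.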
