
Smooth Support Vector Ordinal Regression with Truncated Loss


Abstract

Support vector ordinal regression (SVOR) has proven to be a promising algorithm for solving ordinal regression problems, but its performance is strongly affected by outliers in the training data. To remedy this drawback, a truncated-loss smooth SVOR (TLS-SVOR) is proposed. While learning the ordinal regression model, the loss s incurred by a misranked sample is bounded between 0 and the truncation coefficient u. First, a piecewise polynomial function parameterized by u is constructed to approximate s. Then, following the smoothing strategy of the smooth support vector machine for classification, the optimization problem is reformulated as an unconstrained objective that is twice continuously differentiable, so Newton's method can be applied directly to obtain the unique discriminant hyperplane. The optimal parameter combination of TLS-SVOR is determined by a two-stage uniform-design model selection procedure. Experimental results show that, compared with other ordinal regression approaches, TLS-SVOR achieves higher accuracy on multiple benchmark datasets.
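The sketch below illustrates the core idea summarized in the abstract: cap the hinge-type loss at a truncation coefficient u, replace the capped loss with a twice continuously differentiable piecewise polynomial, and minimize the resulting unconstrained objective with a (damped) Newton iteration. It is not the paper's implementation: the specific quartic blending polynomial, the "one threshold per adjacent rank pair" SVOR formulation, the damping term, and every function name are assumptions chosen for clarity, and the two-stage uniform-design parameter search is omitted.

```python
# Minimal illustrative sketch of a TLS-SVOR-style trainer (assumptions noted above).
import numpy as np

def smooth_plus(z, eps):
    """C^2 piecewise-polynomial surrogate for max(z, 0): exactly 0 for
    z <= -eps, exactly z for z >= eps, and a quartic blend in between."""
    s = np.clip((z + eps) / (2.0 * eps), 0.0, 1.0)
    return np.where(z <= -eps, 0.0, np.where(z >= eps, z, eps * s**3 * (2.0 - s)))

def smooth_plus_d1(z, eps):
    s = np.clip((z + eps) / (2.0 * eps), 0.0, 1.0)
    return np.where(z <= -eps, 0.0, np.where(z >= eps, 1.0, 3.0 * s**2 - 2.0 * s**3))

def smooth_plus_d2(z, eps):
    s = np.clip((z + eps) / (2.0 * eps), 0.0, 1.0)
    return np.where((z <= -eps) | (z >= eps), 0.0, 3.0 * s * (1.0 - s) / eps)

def trunc_loss_d1(z, u, eps):
    """First derivative of the smooth surrogate of min(max(z, 0), u),
    built as smooth_plus(z) - smooth_plus(z - u); assumes u > 2*eps."""
    return smooth_plus_d1(z, eps) - smooth_plus_d1(z - u, eps)

def trunc_loss_d2(z, u, eps):
    return smooth_plus_d2(z, eps) - smooth_plus_d2(z - u, eps)

def fit_tls_svor(X, y, K, C=1.0, u=2.0, eps=0.1, iters=30, damping=1e-4):
    """Damped Newton iteration on the unconstrained smoothed objective
    0.5*||w||^2 + C * sum_ij L(z_ij), where for sample i and threshold j,
    z_ij = 1 - sgn_ij * (w.x_i - theta_j) and sgn_ij = +1 if rank y_i lies
    above threshold j, else -1 (ranks are 1..K)."""
    n, d = X.shape
    beta = np.zeros(d + K - 1)                 # beta = [w, theta_1..theta_{K-1}]
    for _ in range(iters):
        w, theta = beta[:d], beta[d:]
        grad = np.concatenate([w, np.zeros(K - 1)])
        hess = np.zeros((d + K - 1, d + K - 1))
        hess[:d, :d] = np.eye(d)               # from the 0.5*||w||^2 term
        for i in range(n):
            for j in range(K - 1):
                sgn = 1.0 if y[i] > j + 1 else -1.0
                z = 1.0 - sgn * (X[i] @ w - theta[j])
                g = np.zeros(d + K - 1)        # g = dz/dbeta
                g[:d] = -sgn * X[i]
                g[d + j] = sgn
                grad += C * trunc_loss_d1(z, u, eps) * g
                hess += C * trunc_loss_d2(z, u, eps) * np.outer(g, g)
        # The truncated loss makes the objective non-convex, so a small
        # damping term keeps the Newton system solvable (a simplification).
        beta -= np.linalg.solve(hess + damping * np.eye(d + K - 1), grad)
    return beta[:d], np.sort(beta[d:])         # sort: crude way to keep thresholds ordered

def predict_rank(X, w, theta):
    """Predicted rank = 1 + number of thresholds the score w.x exceeds."""
    return 1 + np.sum((X @ w)[:, None] > theta[None, :], axis=1)
```

Under this sketch, a badly misranked outlier contributes at most C*u per violated threshold to the objective, which is the robustness argument the abstract makes for truncating the loss.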
