Neural Processing Letters

Improved Sparsity of Support Vector Machine with Robustness Towards Label Noise Based on Rescaled α-Hinge Loss with Non-smooth Regularizer

Abstract

As support vector machines (SVMs) are used extensively in machine learning applications, it becomes essential to obtain a sparse model that is also robust to noise in the data set. Although many researchers have presented different approaches to obtaining a robust SVM, the work on robust SVM based on the rescaled hinge loss function (RSVM-RHHQ) has attracted a great deal of attention. Combining correntropy with the hinge loss function adds a noticeable amount of robustness to the model; however, the sparsity of the model can be further improved. In this work, we focus on enhancing the sparsity of RSVM-RHHQ. As this work improves on RSVM-RHHQ, we follow its experimental protocol of injecting label noise into the data, but with an altogether new problem formulation. We apply correntropy to the α-hinge loss function, which yields a better loss function than the rescaled hinge loss. We pair this non-convex, non-smooth loss function with a non-smooth regularizer and solve the resulting problem using the primal-dual proximal method. We find that this combination not only adds sparsity to the model but also surpasses existing robust SVM methods in robustness towards label noise. We also provide a convergence proof for the proposed approach and analyze the time complexity of the optimization technique. We perform experiments over various publicly available real-world data sets to compare the proposed method with existing robust SVM methods, using small data sets, large data sets, and data sets with significant class imbalance. Experimental results show that the proposed approach outperforms existing methods in sparseness, accuracy, and robustness. We also provide a sensitivity analysis of the regularization parameter under label noise in the data set.
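
For concreteness, the following is a sketch of the correntropy-based rescaling underlying this construction, assuming it follows the same form as the rescaled hinge loss of RSVM-RHHQ; the precise definition of the α-hinge loss \(\ell_{\alpha}\) and the constants below are illustrative assumptions, not the paper's own statement:

\[
\ell_{\mathrm{hinge}}(z) = \max(0,\, 1 - z), \qquad z = y\,(w^{\top}x + b),
\]
\[
\ell_{\mathrm{rescaled}\text{-}\alpha}(z) = \beta\left[1 - \exp\!\big(-\eta\,\ell_{\alpha}(z)\big)\right], \qquad \beta = \frac{1}{1 - e^{-\eta}},
\]

where \(\eta > 0\) controls how strongly large losses are suppressed. Because the rescaled loss saturates at \(\beta\) as \(\ell_{\alpha}(z)\) grows, mislabeled points far on the wrong side of the margin contribute only a bounded penalty, which is the source of the robustness to label noise.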
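The abstract does not reproduce the solver's update rules. As a minimal, self-contained sketch of the primitive that proximal methods rely on to handle a non-smooth regularizer, the Python snippet below implements the proximal operator of an ℓ1 penalty (soft-thresholding); the ℓ1 choice and the names used here are illustrative assumptions, not taken from the paper:

import numpy as np

def prox_l1(w, step):
    """Proximal operator of step * ||.||_1 (soft-thresholding).

    Solves argmin_v 0.5 * ||v - w||^2 + step * ||v||_1 in closed form.
    Coordinates with |w_i| <= step are set exactly to zero, which is
    how a non-smooth l1-type regularizer produces a sparse model.
    """
    return np.sign(w) * np.maximum(np.abs(w) - step, 0.0)

# Toy usage: small weights vanish exactly, large ones shrink by `step`.
w = np.array([0.8, -0.05, 0.3, -1.2, 0.02])
print(prox_l1(w, step=0.1))  # [ 0.7 -0.   0.2 -1.1  0. ]

In a primal-dual proximal scheme, a closed-form step of this kind alternates with gradient-type updates on the (rescaled) loss, so the non-smooth regularizer is handled exactly rather than being smoothed away.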
