Annual Conference on Neural Information Processing Systems

Learning with Symmetric Label Noise: The Importance of Being Unhinged



Abstract

Convex potential minimisation is the de facto approach to binary classification. However, Long and Servedio [2010] proved that under symmetric label noise (SLN), minimisation of any convex potential over a linear function class can result in classification performance equivalent to random guessing. This ostensibly shows that convex losses are not SLN-robust. In this paper, we propose a convex, classification-calibrated loss and prove that it is SLN-robust. The loss avoids the Long and Servedio [2010] result by virtue of being negatively unbounded. The loss is a modification of the hinge loss, where one does not clamp at zero; hence, we call it the unhinged loss. We show that the optimal unhinged solution is equivalent to that of a strongly regularised SVM, and is the limiting solution for any convex potential; this implies that strong l_2 regularisation makes most standard learners SLN-robust. Experiments confirm that the unhinged loss's SLN-robustness is borne out in practice. So, with apologies to Wilde [1895], while the truth is rarely pure, it can be simple.
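As a minimal sketch (not the authors' code), the construction in the abstract amounts to dropping the hinge loss's clamp at zero, which makes the loss linear in the margin. Linearity is what yields SLN-robustness: flipping a label with probability rho only applies an affine, order-preserving transform to the expected loss, so the minimiser is unchanged.

```python
import numpy as np

def hinge_loss(y, score):
    """Standard hinge loss: max(0, 1 - y*score)."""
    return np.maximum(0.0, 1.0 - y * score)

def unhinged_loss(y, score):
    """Unhinged loss: the hinge loss without the clamp at zero,
    so it is linear in the margin and negatively unbounded."""
    return 1.0 - y * score

# Flipping y -> -y with probability rho scales the expected margin
# term by (1 - 2*rho): an affine change to the risk, order-preserving
# whenever rho < 1/2, so the clean minimiser is preserved.
y, score, rho = 1.0, 2.0, 0.2
noisy_expected = (1 - rho) * unhinged_loss(y, score) + rho * unhinged_loss(-y, score)
assert np.isclose(noisy_expected, 1.0 - (1 - 2 * rho) * y * score)
```

The same affine-transform calculation fails for the clamped hinge loss, since `max(0, ...)` breaks linearity in the margin.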


