
A Note on Support Vector Machine Degeneracy


Abstract

When training Support Vector Machines (SVMs) over non-separable data sets, one sets the threshold b using any dual cost coefficient that is strictly between the bounds of 0 and C. We show that there exist SVM training problems whose dual optimal solutions have all coefficients at bounds, but that all such problems are degenerate in the sense that the "optimal separating hyperplane" is given by w = 0, and the resulting (degenerate) SVM will classify all future points identically (to the class that supplies more training data). We also derive necessary and sufficient conditions on the input data for this to occur. Finally, we show that an SVM training problem can always be made degenerate by the addition of a single data point belonging to a certain unbounded polyhedron, which we characterize in terms of its extreme points and rays.
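The degenerate outcome described above can be reproduced numerically. The sketch below is not from the paper; it is a hypothetical illustration using scikit-learn's `SVC` on an intentionally pathological data set in which every training point sits at the same location and one class supplies more examples than the other. The optimal weight vector collapses to w = 0, and every input is assigned to the majority class.

```python
import numpy as np
from sklearn.svm import SVC

# Three identical points at the origin: two labeled +1, one labeled -1.
# The classes are inseparable by construction, and the dual problem is
# degenerate in the sense discussed in the abstract.
X = np.zeros((3, 1))
y = np.array([1, 1, -1])

clf = SVC(kernel="linear", C=1.0).fit(X, y)

# The "optimal separating hyperplane" collapses: w = sum_i alpha_i y_i x_i = 0.
w = clf.coef_.ravel()

# With w = 0 the decision function is constant, so every point -- however far
# from the training data -- receives the majority label (+1 here).
preds = clf.predict([[-5.0], [0.0], [5.0]])
```

Under these assumptions `w` is exactly zero and `preds` is `[1, 1, 1]`: the classifier has no discriminative power, which is precisely why such problems are called degenerate.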

