IAPR Workshop on Artificial Neural Networks in Pattern Recognition (ANNPR 2006), August 31 – September 2, 2006, Ulm, Germany

Fast Training of Linear Programming Support Vector Machines Using Decomposition Techniques

Abstract

Decomposition techniques are used to speed up the training of support vector machines, but for linear programming support vector machines (LP-SVMs) a direct implementation of decomposition leads to infinite loops. To solve this problem and to further speed up training, we propose in this paper an improved decomposition technique for training LP-SVMs. If an infinite loop is detected, we include in the next working set all the data from the working sets that form the loop. To further accelerate training, we improve the working set selection strategy: at each iteration step, we check the number of violations of the complementarity conditions and constraints. If the number of violations increases, we conclude that important data have been removed from the working set and restore those data. Computer experiments demonstrate that training with the proposed decomposition technique and improved working set selection is drastically faster than training without decomposition, and it is faster than training without the improved working set selection in all cases tested.
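The infinite-loop remedy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: working sets are assumed to be represented as frozensets of training-sample indices, and the function name is hypothetical.

```python
def merge_on_cycle(history, candidate):
    """Infinite-loop remedy sketched in the abstract.

    `history` is the list of working sets visited so far (frozensets of
    sample indices); `candidate` is the working set proposed for the next
    iteration. If `candidate` has already been visited, the sets seen
    since that visit form a cycle, so the next working set is the union
    of all data in the cycling working sets. Otherwise `candidate` is
    returned unchanged.
    """
    if candidate in history:
        start = history.index(candidate)  # first occurrence closing the cycle
        merged = set(candidate)
        for ws in history[start:]:        # every working set inside the cycle
            merged |= ws
        return frozenset(merged)
    return candidate
```

For example, if the selection alternates between {0, 1} and {2, 3}, proposing {0, 1} again triggers the merge and the next subproblem is solved over {0, 1, 2, 3}, which breaks the cycle at the cost of a larger subproblem.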
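The improved working set selection can be sketched in the same spirit. The abstract gives only the safeguard, not the base selection rule, so the sketch below assumes a candidate working set and the just-removed data are already available; all names are hypothetical.

```python
def safeguard_selection(candidate, removed, violations, prev_violations):
    """Working-set selection safeguard sketched in the abstract.

    `violations` counts the current violations of the complementarity
    conditions and constraints; `prev_violations` is the count after the
    previous subproblem. If the count increased, the data in `removed`
    are assumed to have been important, so they are restored into the
    working set; otherwise the candidate is kept as proposed.
    """
    if violations > prev_violations:
        return candidate | removed  # restore the removed data
    return candidate
```

The design intuition is that a growing violation count signals that the last removal discarded samples still needed to satisfy the optimality conditions, so undoing that removal avoids oscillation without re-solving from scratch.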
