AAAI Conference on Artificial Intelligence

Improving Efficiency of SVM k-Fold Cross-Validation by Alpha Seeding



Abstract

The k-fold cross-validation is commonly used to evaluate the effectiveness of SVMs with the selected hyper-parameters. It is known that SVM k-fold cross-validation is expensive, since it requires training k SVMs. However, little work has explored reusing the h-th SVM to train the (h+1)-th SVM and thereby improve the efficiency of k-fold cross-validation. In this paper, we propose three algorithms that reuse the h-th SVM to improve the efficiency of training the (h+1)-th SVM. Our key idea is to use the previously trained SVM to efficiently identify the support vectors of the next SVM and to accurately estimate their associated weights (also called alpha values). Our experimental results show that our algorithms are several times faster than k-fold cross-validation that does not reuse the previously trained SVM, while producing identical results (and hence identical accuracy).
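The alpha-seeding idea in the abstract can be sketched with a toy solver. The paper targets standard SVM training (e.g. SMO); here, as an illustrative stand-in, a projected-gradient ascent on a bias-free SVM dual plays the role of the trainer, and the alphas learned on fold h's training set seed the optimizer for fold h+1, whose training set overlaps heavily with it. The function name `train_dual_svm` and the synthetic data are assumptions for this sketch, not artifacts of the paper.

```python
import numpy as np

def train_dual_svm(K, y, C=1.0, alpha0=None, lr=0.01, tol=1e-4, max_iter=20000):
    """Solve the bias-free SVM dual  max_a sum(a) - 0.5 a^T Q a,  0 <= a_i <= C,
    by projected gradient ascent. alpha0 is the seed; None means a cold start.
    Returns (alpha, iterations_used). Illustrative only, not the paper's solver."""
    Q = (y[:, None] * y[None, :]) * K
    alpha = np.zeros(len(y)) if alpha0 is None else alpha0.copy()
    for it in range(max_iter):
        step = np.clip(alpha + lr * (1.0 - Q @ alpha), 0.0, C)  # ascend, then project
        if np.max(np.abs(step - alpha)) < tol:
            return step, it + 1
        alpha = step
    return alpha, max_iter

# Toy k-fold cross-validation with alpha seeding: consecutive folds share most
# training points, so the previous fold's alphas are a cheap warm start.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)   # linearly separable toy labels
K_full = X @ X.T                                  # linear kernel
k = 5
folds = np.array_split(np.arange(len(y)), k)
alpha_global = np.zeros(len(y))                   # last known alpha per sample
iters = []
for h in range(k):
    tr = np.concatenate([folds[j] for j in range(k) if j != h])
    K = K_full[np.ix_(tr, tr)]
    seed = alpha_global[tr]                       # fold 0 gets zeros (cold start)
    a, n_it = train_dual_svm(K, y[tr], alpha0=seed)
    alpha_global[tr] = a                          # reuse for the next fold
    iters.append(n_it)
print(iters)  # later folds often need fewer iterations thanks to seeding
```

The sketch keeps a single global alpha vector indexed by sample id: when fold h+1's training set is assembled, every sample that already appeared in an earlier training set carries its last alpha forward, and only the newly added fold's samples start from zero. This mirrors the abstract's idea of estimating the next SVM's support vectors and weights from the previous one.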


