International Conference on Artificial Neural Networks

Nesterov Acceleration for the SMO Algorithm



Abstract

We revise Nesterov's Accelerated Gradient (NAG) procedure for the SVM dual problem and propose a strictly monotone version of NAG that is capable of accelerating the second-order version of the SMO algorithm. The per-iteration computational cost of the resulting Nesterov Accelerated SMO (NA-SMO) is twice that of SMO, so the reduction in the number of iterations is not likely to translate into time savings for most problems. However, understanding NAG is presently an area of strong research, and some of the resulting ideas may offer avenues for even faster versions of NA-SMO.
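The abstract does not spell out the NAG procedure or the paper's SMO-specific monotone variant. As a point of reference, a minimal sketch of the standard Nesterov accelerated gradient iteration on a generic smooth convex objective (with the common momentum schedule (k-1)/(k+2); the function name `nag` and the toy quadratic are illustrative, not from the paper) might look like:

```python
import numpy as np

def nag(grad, x0, step, iters=500):
    """Standard NAG iteration (illustrative sketch only).

    Note: plain NAG is not monotone in the objective value, which is
    precisely why the paper proposes a strictly monotone variant; that
    variant is not reproduced here.
    """
    x_prev = x0.copy()
    y = x0.copy()
    for k in range(1, iters + 1):
        x = y - step * grad(y)              # gradient step at the lookahead point
        y = x + (k - 1) / (k + 2) * (x - x_prev)  # momentum extrapolation
        x_prev = x
    return x

# Toy strongly convex quadratic: f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)
x = nag(lambda v: A @ v - b, np.zeros(2), step=0.25)
```

The step size 0.25 is below 1/L for this matrix (largest eigenvalue ≈ 3.62), so the iterates converge to the minimizer.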
