International Conference on Artificial Neural Networks

Nesterov Acceleration for the SMO Algorithm



Abstract

We revisit Nesterov's Accelerated Gradient (NAG) procedure for the SVM dual problem and propose a strictly monotone version of NAG that is capable of accelerating the second-order version of the SMO algorithm. The computational cost per iteration of the resulting Nesterov Accelerated SMO (NA-SMO) is twice that of SMO, so the reduction in the number of iterations is unlikely to translate into time savings for most problems. However, understanding NAG is currently an area of active research, and some of the resulting ideas may offer avenues for even faster versions of NA-SMO.
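For readers unfamiliar with the acceleration scheme referred to above, the following is a minimal sketch of the plain (unconstrained) NAG recursion on a smooth convex quadratic, written in Python with NumPy. The function `nag`, its parameters, and the toy quadratic are illustrative assumptions only; they do not reproduce the strictly monotone, SMO-coupled variant proposed in the paper, whose dual problem also carries box and equality constraints handled by SMO's pairwise updates.

```python
import numpy as np

def nag(A, b, x0, lr, n_iters=100):
    """Generic Nesterov Accelerated Gradient on f(x) = 0.5 x^T A x - b^T x.

    Illustrative sketch only; not the paper's NA-SMO.
    """
    x = x0.copy()
    y = x0.copy()          # extrapolated ("look-ahead") point
    t = 1.0                # momentum parameter sequence
    for _ in range(n_iters):
        grad = A @ y - b                # gradient of f at the look-ahead point
        x_next = y - lr * grad          # plain gradient step from y
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # extrapolation step
        x, t = x_next, t_next
    return x

# Tiny usage example on a random positive-definite quadratic,
# with step size 1/L where L is the largest eigenvalue of A.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)
b = rng.standard_normal(5)
x_star = nag(A, b, np.zeros(5), lr=1.0 / np.linalg.norm(A, 2))
```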


