International Journal of Machine Learning and Cybernetics

Large-scale support vector regression with budgeted stochastic gradient descent

Abstract

Support vector regression (SVR) is a widely used regression technique owing to its competitive performance. However, non-linear SVR is time-consuming on large-scale tasks because of the curse of kernelization: the model is a weighted sum over support vectors, whose number grows with the training set size. Recently, a budgeted stochastic gradient descent (BSGD) method was developed to train large-scale kernelized support vector classification (SVC) models. In this paper, we extend the BSGD method to non-linear regression tasks. Based on the observed performance of different budget maintenance strategies, we combine the stochastic gradient descent (SGD) method with the merging strategy, which keeps the number of support vectors within a fixed budget by merging pairs of them whenever the budget is exceeded. Experimental results on real-world datasets show that the proposed kernelized SVR with BSGD achieves competitive accuracy and good computational efficiency compared with several state-of-the-art algorithms.
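To make the approach concrete, the sketch below shows budgeted SGD training of a kernel SVR with the epsilon-insensitive loss and a merging budget maintenance step. It is a minimal illustration under stated assumptions, not the paper's implementation: the class name BudgetedKernelSVR, the hyperparameter names, the Gaussian kernel, the Pegasos-style decaying step size, and the closest-pair merge heuristic are all choices made here for illustration.

import numpy as np

class BudgetedKernelSVR:
    """Minimal sketch (hypothetical API): budgeted SGD for kernel SVR.

    budget -- maximum number of support vectors kept (the budget B)
    gamma  -- Gaussian kernel width parameter
    lam    -- L2 regularization strength
    eps    -- half-width of the epsilon-insensitive tube
    """

    def __init__(self, budget=100, gamma=1.0, lam=1e-4, eps=0.1):
        self.budget, self.gamma, self.lam, self.eps = budget, gamma, lam, eps
        self.sv = []      # support vectors
        self.alpha = []   # their coefficients

    def _kernel(self, x, z):
        return np.exp(-self.gamma * np.sum((x - z) ** 2))

    def predict(self, x):
        return sum(a * self._kernel(x, v) for a, v in zip(self.alpha, self.sv))

    def partial_fit(self, x, y, t):
        x = np.asarray(x, dtype=float)
        eta = 1.0 / (self.lam * (t + 1))   # Pegasos-style decaying step size
        # shrink all coefficients (gradient step on the L2 regularizer)
        self.alpha = [(1.0 - eta * self.lam) * a for a in self.alpha]
        err = y - self.predict(x)
        if abs(err) > self.eps:            # outside the eps-tube: add a new SV
            self.sv.append(x)
            self.alpha.append(eta * np.sign(err))
            if len(self.sv) > self.budget: # budget exceeded: merge two SVs
                self._merge_closest_pair()

    def _merge_closest_pair(self):
        # Merging strategy (simplified heuristic): replace the two closest
        # support vectors by one weighted average, so the model size never
        # exceeds the budget.
        n = len(self.sv)
        best, bi, bj = np.inf, 0, 1
        for i in range(n):
            for j in range(i + 1, n):
                d = np.sum((self.sv[i] - self.sv[j]) ** 2)
                if d < best:
                    best, bi, bj = d, i, j
        ai, aj = self.alpha[bi], self.alpha[bj]
        w = abs(ai) / (abs(ai) + abs(aj) + 1e-12)
        z = w * self.sv[bi] + (1.0 - w) * self.sv[bj]
        # coefficient that best approximates the two merged terms at z
        az = ai * self._kernel(self.sv[bi], z) + aj * self._kernel(self.sv[bj], z)
        for idx in sorted((bi, bj), reverse=True):  # delete higher index first
            del self.sv[idx]
            del self.alpha[idx]
        self.sv.append(z)
        self.alpha.append(az)

In a streaming usage pattern, one would iterate over (x, y) pairs and call model.partial_fit(x, y, t) for each step t. The point of budget maintenance is visible here: prediction cost and memory stay bounded by the budget no matter how many examples are processed, whereas an unbudgeted kernel model would keep growing.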
