
Accelerated Mini-batch Randomized Block Coordinate Descent Method


Abstract

We consider regularized empirical risk minimization problems. In particular, we minimize the sum of a smooth empirical risk function and a nonsmooth regularization function. When the regularization function is block separable, we can solve the minimization problems in a randomized block coordinate descent (RBCD) manner. Existing RBCD methods usually decrease the objective value by exploiting the partial gradient of a randomly selected block of coordinates in each iteration. Thus they need all data to be accessible so that the partial gradient over the selected block can be computed exactly. However, such a "batch" setting may be computationally expensive in practice. In this paper, we propose a mini-batch randomized block coordinate descent (MRBCD) method, which estimates the partial gradient of the selected block from a mini-batch of randomly sampled data in each iteration. We further accelerate the MRBCD method by exploiting a semi-stochastic optimization scheme, which effectively reduces the variance of the partial gradient estimators. Theoretically, we show that for strongly convex functions, the MRBCD method attains lower overall iteration complexity than existing RBCD methods. As an application, we further tailor the MRBCD method to solve regularized sparse learning problems. Our numerical experiments show that the MRBCD method naturally exploits the sparsity structure and achieves better computational performance than existing methods.
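The scheme described in the abstract can be illustrated on a small example. The sketch below is an illustrative implementation, not the authors' reference code: it applies a mini-batch randomized block coordinate update with an SVRG-style (semi-stochastic) variance-reduced partial gradient to an ℓ1-regularized least-squares problem, where the block-separable regularizer is handled by a proximal (soft-thresholding) step on the selected block. All parameter names and values (`eta`, `lam`, block and batch sizes) are assumptions chosen for the demo.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1 (handles the nonsmooth regularizer)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def mrbcd(A, b, lam, num_blocks=4, batch_size=20, eta=0.05,
          epochs=30, inner_iters=50, seed=0):
    """Mini-batch randomized block coordinate descent with semi-stochastic
    variance reduction, applied to min_x 1/(2n)||Ax - b||^2 + lam*||x||_1."""
    n, d = A.shape
    blocks = np.array_split(np.arange(d), num_blocks)
    x = np.zeros(d)
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        # Snapshot point and its full gradient (the semi-stochastic anchor).
        x_tilde = x.copy()
        full_grad = A.T @ (A @ x_tilde - b) / n
        for _ in range(inner_iters):
            j = rng.integers(num_blocks)                 # random block
            idx = blocks[j]
            batch = rng.choice(n, size=batch_size, replace=False)
            Ab = A[batch]
            # Mini-batch partial gradients at x and at the snapshot.
            g = Ab[:, idx].T @ (Ab @ x - b[batch]) / batch_size
            g_tilde = Ab[:, idx].T @ (Ab @ x_tilde - b[batch]) / batch_size
            # Variance-reduced estimate of the partial gradient.
            v = g - g_tilde + full_grad[idx]
            # Proximal gradient step restricted to the selected block.
            x[idx] = soft_threshold(x[idx] - eta * v, eta * lam)
    return x
```

The control variate `g - g_tilde + full_grad[idx]` is unbiased for the partial gradient at `x`, and its variance shrinks as `x` approaches the snapshot, which is what permits a constant step size here.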
