Neurocomputing

Gradient preconditioned mini-batch SGD for ridge regression



Abstract

The data preconditioning technique, which reduces the condition number of a problem through a linear transformation of the data matrix, is typically used to accelerate the convergence of first-order optimization methods for regularized loss minimization. One obvious limitation of the technique is its exceedingly high computational cost on large-scale problems, especially those with enormous numbers of samples. In this paper, we propose a gradient preconditioning trick and combine it with mini-batch SGD. The proposed gradient preconditioned mini-batch SGD algorithm indeed accelerates convergence for ridge regression, at a lower computational cost than the data preconditioning technique. Concretely, we use recent random projection and linear sketching methods to compute a randomized low-rank approximation of the data matrix, from which we obtain an appropriate preconditioner through numerical linear algebra. Finally, we apply the obtained preconditioner to the gradient to reduce the computational cost. Experimental results on both synthetic and real data sets validate the feasibility and effectiveness of our trick and algorithm. (C) 2020 Elsevier B.V. All rights reserved.
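The abstract describes the pipeline (sketch the data matrix, build a preconditioner from the low-rank factors, apply it to each mini-batch gradient) without giving formulas. Below is a minimal Python/NumPy sketch of that style of method, assuming the ridge objective (1/(2n))‖Xw − y‖² + (λ/2)‖w‖²; the Gaussian range finder, the Woodbury-style inverse, and all function names, parameters, and step sizes are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def randomized_low_rank(X, k, oversample=10, seed=0):
    """Rank-k sketch of X via a Gaussian random projection
    (randomized range finder + SVD of the small projected matrix)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Omega = rng.standard_normal((d, k + oversample))   # random test matrix
    Q, _ = np.linalg.qr(X @ Omega)                     # basis for an approximate range of X
    _, s, Vt = np.linalg.svd(Q.T @ X, full_matrices=False)
    return s[:k], Vt[:k]                               # top-k singular values / right vectors

def make_preconditioner(X, lam, k):
    """Approximate inverse of the ridge Hessian H = X^T X / n + lam*I,
    assembled from the rank-k factors via the Woodbury identity."""
    n = X.shape[0]
    s, Vt = randomized_low_rank(X, k)
    d_inv = 1.0 / (s ** 2 / n + lam)        # exact inverse eigenvalues on span(V)
    def apply_P(g):
        c = Vt @ g
        # invert H on span(V); scale the orthogonal complement by 1/lam
        return Vt.T @ (d_inv * c) + (g - Vt.T @ c) / lam
    return apply_P

def precond_minibatch_sgd(X, y, lam=1e-2, k=20, batch=64, lr=1.0, epochs=20, seed=0):
    """Mini-batch SGD for ridge regression, with the sketched
    preconditioner applied to every stochastic gradient."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    apply_P = make_preconditioner(X, lam, k)
    w = np.zeros(d)
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(n), max(n // batch, 1)):
            Xb, yb = X[idx], y[idx]
            g = Xb.T @ (Xb @ w - yb) / len(idx) + lam * w   # mini-batch ridge gradient
            w -= lr * apply_P(g)                            # preconditioned update
    return w
```

Under these assumptions, the sketch is computed once and every SGD step applies it in O(kd) time, so the per-iteration cost stays close to plain mini-batch SGD while the effective condition number seen by the iterates is governed by the spectrum below the k-th singular value; the paper's actual preconditioner construction and step-size schedule may differ.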