Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD)

A Learning-Rate Schedule for Stochastic Gradient Methods to Matrix Factorization



Abstract

Stochastic gradient methods are effective for solving matrix factorization problems. However, it is well known that their performance depends heavily on the learning-rate schedule used; a good schedule can significantly accelerate training. In this paper, motivated by past work on convex optimization that assigns a learning rate to each variable, we propose a new schedule for matrix factorization. Experiments demonstrate that the proposed schedule converges faster than existing ones. Our schedule uses the same parameter values on all data sets in our experiments, so the time spent on learning-rate selection can be significantly reduced. Applying this schedule to a state-of-the-art matrix factorization package yields an implementation that outperforms available parallel matrix factorization packages.
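The abstract describes assigning a learning rate to each variable of the factorization rather than using a single global rate. The paper's exact schedule is not reproduced here; the sketch below illustrates the general idea with a per-coordinate, AdaGrad-style schedule for SGD on matrix factorization, assuming a squared loss with L2 regularization. All names (`sgd_mf_per_coord`, `eta0`, `lam`) are illustrative choices, not the paper's API.

```python
import numpy as np

def sgd_mf_per_coord(ratings, n_users, n_items, k=8, lam=0.05,
                     eta0=0.1, epochs=20, seed=0):
    """SGD matrix factorization with a per-coordinate learning-rate
    schedule: each entry of the factor matrices P and Q accumulates its
    squared gradients, and its step size is eta0 / sqrt(accumulated sum).
    This is an AdaGrad-style sketch, not the paper's exact schedule.

    ratings: iterable of (user, item, value) triples.
    """
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, k))
    Q = 0.1 * rng.standard_normal((n_items, k))
    GP = np.full_like(P, 1e-8)   # accumulated squared gradients for P
    GQ = np.full_like(Q, 1e-8)   # accumulated squared gradients for Q
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]              # residual of one rating
            gp = -err * Q[i] + lam * P[u]      # gradient w.r.t. row P[u]
            gq = -err * P[u] + lam * Q[i]      # gradient w.r.t. row Q[i]
            GP[u] += gp ** 2
            GQ[i] += gq ** 2
            P[u] -= eta0 / np.sqrt(GP[u]) * gp  # per-coordinate step
            Q[i] -= eta0 / np.sqrt(GQ[i]) * gq
    return P, Q

def rmse(ratings, P, Q):
    """Root-mean-square error of the factorization on given ratings."""
    se = [(r - P[u] @ Q[i]) ** 2 for u, i, r in ratings]
    return float(np.sqrt(np.mean(se)))
```

Because each coordinate's step size shrinks according to its own gradient history, frequently updated coordinates slow down while rarely touched ones keep larger steps, which is the kind of adaptivity the per-variable motivation refers to.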
