International Conference on Machine Learning

Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for SGD



Abstract

We study Stochastic Gradient Descent (SGD) with diminishing step sizes for convex objective functions. We introduce a definitional framework and theory that defines and characterizes a core property of convex objective functions, called curvature. In terms of curvature we derive a new inequality that can be used to compute an optimal sequence of diminishing step sizes by solving a differential equation. Our exact solutions confirm known results in the literature and allow us to fully characterize a new regularizer together with its corresponding expected convergence rates.
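The setting the abstract describes can be illustrated with a minimal sketch: SGD on a strongly convex objective using the classical diminishing schedule eta_t = 1/(mu*(t+1)), which is known to give an O(1/t) expected convergence rate. The quadratic objective, noise model, and step-size constant below are illustrative assumptions, not the paper's construction (the paper derives its optimal schedules from a differential equation based on curvature).

```python
import numpy as np

def sgd_diminishing(grad_sample, w0, mu, n_steps, rng):
    """Run SGD with the classical diminishing step sizes eta_t = 1/(mu*(t+1)),
    a standard choice for mu-strongly convex objectives (illustrative, not the
    paper's curvature-derived schedule)."""
    w = np.array(w0, dtype=float)
    for t in range(n_steps):
        eta = 1.0 / (mu * (t + 1))          # diminishing step size
        w = w - eta * grad_sample(w, rng)    # stochastic gradient step
    return w

# Toy example (hypothetical objective): f(w) = 0.5 * mu * ||w||^2,
# with additive Gaussian gradient noise of scale 0.1.
mu = 1.0
rng = np.random.default_rng(0)
grad = lambda w, rng: mu * w + rng.normal(scale=0.1, size=w.shape)
w_final = sgd_diminishing(grad, [5.0, -3.0], mu, 5000, rng)
```

After 5000 steps the iterate should sit near the minimizer at the origin, since the expected squared distance decays on the order of 1/t under this schedule.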
