JMLR: Workshop and Conference Proceedings

Exponentiated Gradient Meets Gradient Descent



Abstract

(Stochastic) gradient descent and the multiplicative update method are probably the most popular algorithms in machine learning. We introduce and study a new regularization which provides a unification of the additive and multiplicative updates. This regularization is derived from a hyperbolic analogue of the entropy function, which we call hypentropy. It is motivated by a natural extension of the multiplicative update to negative numbers. The hypentropy has a natural spectral counterpart, which we use to derive a family of matrix-based updates that bridge gradient methods and the multiplicative method for matrices. While the latter is only applicable to positive semi-definite matrices, the spectral hypentropy method can naturally be used with general rectangular matrices. We analyze the new family of updates by deriving tight regret bounds. We study empirically the applicability of the new update to settings such as multiclass learning, in which the parameters constitute a general rectangular matrix.
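The abstract does not state the update rule itself. As a rough illustration, here is a minimal sketch of a hypentropy-style mirror-descent step, under the assumption (ours, not a quote from the paper) that the mirror map's gradient is arcsinh(x_i / beta): a step maps the iterate to the dual space, takes a gradient step there, and maps back through beta * sinh(.). The function name hypentropy_step and the concrete numbers below are illustrative.

    import numpy as np

    def hypentropy_step(x, grad, eta, beta):
        # Sketch of one mirror-descent step under an assumed hypentropy
        # mirror map: dual variable theta_i = arcsinh(x_i / beta),
        # gradient step in the dual, inverse map x_i = beta * sinh(theta_i).
        theta = np.arcsinh(x / beta) - eta * grad
        return beta * np.sinh(theta)

    x = np.array([0.5, -0.2, 0.3])   # signed weights (hypothetical values)
    g = np.array([0.1, -0.4, 0.2])   # loss gradient at x (hypothetical)

    # Small beta: the step acts near-multiplicatively on |x_i|, like an
    # exponentiated-gradient update extended to signed weights.
    print(hypentropy_step(x, g, eta=0.1, beta=1e-3))

    # Large beta, with eta rescaled by 1/beta: close to an additive
    # gradient-descent step x - 0.1 * g.
    print(hypentropy_step(x, g, eta=0.1 / 1e3, beta=1e3))

For small eta * grad, beta * sinh(arcsinh(x/beta) - eta*grad) ≈ x - eta * sqrt(beta^2 + x^2) * grad, which explains both regimes: when beta dominates, sqrt(beta^2 + x^2) ≈ beta and the step is additive, while for |x| much larger than beta the step scales with |x|, i.e. multiplicatively. On this reading, the spectral variant mentioned in the abstract would apply the same scalar map to the singular values of a general rectangular parameter matrix.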

