Dimension-Free Exponentiated Gradient

Abstract

I present a new online learning algorithm that extends the exponentiated gradient framework to infinite-dimensional spaces. My analysis shows that the algorithm is implicitly able to estimate the L_2 norm of the unknown competitor, U, achieving a regret bound of order O(U log(UT + 1) T^(1/2)), instead of the standard O((U^2 + 1) T^(1/2)) achievable without knowing U. For this analysis, I introduce novel tools for algorithms with time-varying regularizers, through the use of local smoothness. Through a lower bound, I also show that the algorithm is optimal up to a (log(UT))^(1/2) term for linear and Lipschitz losses.
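The key idea described above is that the learner can adapt to the unknown competitor norm U by scaling the direction of the accumulated negative gradients with an exponential of its norm, rather than fixing a learning rate in advance. The sketch below illustrates this style of update for online linear losses; the specific scalings (`alpha`, `beta`) and constants are illustrative assumptions, not the paper's exact algorithm or tuning, and the helper names `dfeg_sketch` and `regret` are hypothetical.

```python
import numpy as np

def dfeg_sketch(gradients, a=1.0, delta=1.0):
    """Schematic dimension-free EG-style learner for linear losses <g_t, w>.

    Maintains theta = -(sum of past gradients) and predicts along its
    direction, scaled by exp(||theta|| / alpha_t) / beta_t, so the effective
    step size grows with the evidence for a large competitor norm U.
    The scalings below are illustrative, not the paper's exact choices.
    """
    d = gradients.shape[1]
    theta = np.zeros(d)
    H = delta                      # running sum of squared gradient norms
    iterates = []
    for g in gradients:
        norm = np.linalg.norm(theta)
        alpha = a * np.sqrt(H)     # illustrative time-varying scaling
        beta = H ** 1.5            # illustrative normalization
        if norm > 0:
            w = (theta / norm) * np.exp(norm / alpha) / beta
        else:
            w = np.zeros(d)        # no information yet: predict the origin
        iterates.append(w)
        H += np.linalg.norm(g) ** 2
        theta -= g                 # accumulate negative gradients
    return np.array(iterates)

def regret(iterates, gradients, u):
    """Regret of the iterates against a fixed competitor u,
    sum_t <g_t, w_t - u>, for linear losses."""
    return float(np.sum(gradients * (iterates - u)))
```

A quick usage example: feeding a stream of bounded random gradients and measuring regret against a fixed comparator `u`, without the algorithm ever receiving `||u||` as a parameter.

```python
rng = np.random.default_rng(0)
G = rng.normal(size=(200, 5)) * 0.1   # a stream of linear-loss gradients
W = dfeg_sketch(G)
r = regret(W, G, u=np.ones(5))
```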
