IEEE Transactions on Emerging Topics in Computational Intelligence

Adaptive Evolution Strategies for Stochastic Zeroth-Order Optimization


Abstract

We consider solving a class of unconstrained optimization problems in which only stochastic estimates of the objective functions are available. Existing stochastic optimization methods are mainly extended from gradient-based methods and face the challenges of noisy function evaluations, difficulty in choosing step-sizes, and possibly ill-conditioned landscapes. This paper presents a stochastic evolution strategy (SES) framework and several adaptation schemes to avoid these challenges. The SES framework combines the ideas of population sampling and minibatch sampling in exploiting the zeroth-order gradient information, efficiently reducing the noise in both data selection and gradient approximation. In addition, it admits approximating the gradients using a non-isotropic Gaussian distribution to better capture the curvature information of the landscapes. Based on this framework, we implement a step-size adaptation rule and two covariance matrix adaptation rules, where the former automatically tunes the step-sizes and the latter are intended to cope with ill-conditioning. For SES with certain fixed step-sizes, we establish a nearly optimal convergence rate over smooth landscapes. We also show that using adaptive step-sizes allows convergence at a slightly slower rate, but without the need to know the smoothness constant. Several numerical experiments on machine learning problems verify the above theoretical results and suggest that the adaptive SES methods show much promise.
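To make the idea of combining population sampling with minibatch sampling concrete, the following is a minimal sketch of a generic zeroth-order evolution-strategy update: each population member draws a Gaussian perturbation and evaluates the objective on its own minibatch, and the resulting forward/backward differences are averaged into a gradient estimate. This is a textbook Gaussian-smoothing estimator with an isotropic distribution and a fixed step-size, not the paper's SES adaptation rules; the function signature `f(x, idx)`, the dataset size, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def es_zeroth_order_step(f, x, n_data, sigma=0.05, lr=0.1,
                         pop_size=20, batch_size=32, rng=None):
    """One zeroth-order evolution-strategy step (illustrative sketch).

    f(x, idx) -> float: stochastic objective estimate on minibatch indices idx.
    Uses an isotropic Gaussian and a fixed step-size; the paper's SES framework
    additionally adapts the step-size and the sampling covariance.
    """
    rng = np.random.default_rng() if rng is None else rng
    grad = np.zeros_like(x)
    for _ in range(pop_size):
        u = rng.standard_normal(x.shape)                # population perturbation
        idx = rng.choice(n_data, size=batch_size, replace=False)  # minibatch
        # antithetic two-point zeroth-order gradient estimate on the same batch,
        # so data noise cancels between the +/- evaluations
        grad += (f(x + sigma * u, idx) - f(x - sigma * u, idx)) / (2.0 * sigma) * u
    grad /= pop_size
    return x - lr * grad
```

Evaluating both antithetic points on the same minibatch is what reduces the data-selection noise; averaging over the population reduces the gradient-approximation noise.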
