Handling Expensive Optimization with Large Noise



Abstract

We present lower and upper bounds on runtimes for expensive noisy optimization problems. Runtimes are expressed in terms of the number of fitness evaluations. The fitnesses considered are monotonic transformations of the sphere function. The analysis focuses on the common case of fitness functions that are quadratic in the distance to the optimum in a neighbourhood of that optimum; it nonetheless remains valid for any monotonic polynomial of degree p > 2. Upper bounds are derived via a bandit-based estimation-of-distribution algorithm, called R-EDA, that relies on Bernstein races. It is an evolutionary algorithm in the sense that it is based on selection and random mutations, with a distribution (updated at each iteration) for generating new individuals. The algorithm is known to be consistent (i.e. it converges to the optimum asymptotically in the number of evaluations) even in some non-differentiable cases. Here we show that: (i) if the variance of the noise decreases to 0 around the optimum, R-EDA performs well for quadratic transformations of the distance to the optimum; (ii) otherwise, its convergence rate is slower than the one exhibited empirically by Quadratic Logistic Regression (QLR), an algorithm based on surrogate models, although QLR requires a probabilistic prior on the fitness class.
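The Bernstein races underlying R-EDA decide, from repeated noisy evaluations, which of two candidate points has the better (lower) expected fitness, sampling only until an empirical Bernstein confidence bound separates the two empirical means. A minimal sketch of this idea in Python (the function names, the δ value, and the bounded-noise sphere fitness in the usage example are illustrative assumptions, not the paper's implementation):

```python
import math
import random

def bernstein_bound(samples, delta, value_range):
    """Empirical Bernstein deviation bound: with probability >= 1 - delta,
    the true mean lies within this radius of the empirical mean,
    for i.i.d. samples bounded in an interval of width value_range."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n  # empirical variance
    log_term = math.log(3.0 / delta)
    return math.sqrt(2.0 * var * log_term / n) + 3.0 * value_range * log_term / n

def bernstein_race(f_noisy, x, y, delta=0.05, value_range=1.0, max_evals=10000):
    """Race two candidate points: keep re-evaluating both until the
    confidence intervals around their mean fitnesses separate, then
    return the point with confidently lower fitness (None on timeout)."""
    sx = [f_noisy(x) for _ in range(2)]
    sy = [f_noisy(y) for _ in range(2)]
    while len(sx) + len(sy) < max_evals:
        bx = bernstein_bound(sx, delta, value_range)
        by = bernstein_bound(sy, delta, value_range)
        mx = sum(sx) / len(sx)
        my = sum(sy) / len(sy)
        if mx + bx < my - by:
            return x  # x is confidently better (lower fitness)
        if my + by < mx - bx:
            return y  # y is confidently better
        sx.append(f_noisy(x))
        sy.append(f_noisy(y))
    return None  # budget exhausted without a confident decision
```

Note the shape of the bound: the variance term shrinks as 1/sqrt(n) while the range term shrinks as 1/n, which is why noise whose variance vanishes near the optimum (case (i) in the abstract) lets the race resolve comparisons cheaply close to the optimum.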


