IEEE Congress on Evolutionary Computation

SO-MODS: Optimization for high dimensional computationally expensive multi-modal functions with surrogate search



Abstract

SO-MODS is a new algorithm that combines surrogate global optimization methods with local search. It extends prior algorithms that seek near-optimal solutions for computationally very expensive functions for which the number of allowable evaluations is strictly limited. The global search in SO-MODS perturbs the best point found so far in order to generate new sample points. The number of decision variables perturbed is adjusted dynamically in each iteration so that the search remains effective on higher-dimensional problems; the procedure for dynamically changing the perturbed dimensions is drawn from earlier work on the DYCORS algorithm. We use a cubic radial basis function as the surrogate model and investigate two approaches to improve solution accuracy. The numerical results show that SO-MODS reduces the objective function value dramatically with just a few hundred evaluations, even for 30-dimensional problems. The local search is then able to reduce the objective function value further.
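The abstract describes two mechanisms that are easy to illustrate in code: a DYCORS-style perturbation step, in which each coordinate of the current best point is perturbed with a probability that decays as the evaluation budget is consumed, and candidate scoring with a cubic radial basis function surrogate. The following Python sketch shows one plausible realization of these two steps; the function names (`dycors_candidates`, `pick_next_point`), the parameter choices (`n_cand`, `sigma`, `w_surrogate`), the exact probability schedule, and the use of SciPy's `RBFInterpolator` as the cubic RBF surrogate are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator


def dycors_candidates(x_best, lb, ub, n_eval, max_eval, n_cand=100, sigma=0.2, rng=None):
    """Perturb the current best point to generate candidate sample points.

    Each coordinate is perturbed with a probability that decays as the
    evaluation budget is consumed (a DYCORS-style schedule); at least one
    coordinate is always perturbed.  The schedule below is an assumed form.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x_best.size
    p = min(20.0 / d, 1.0) * (1.0 - np.log(n_eval + 1.0) / np.log(max_eval))
    p = max(p, 1.0 / d)

    cand = np.tile(x_best, (n_cand, 1))
    mask = rng.random((n_cand, d)) < p
    empty = ~mask.any(axis=1)  # candidates with no perturbed coordinate
    mask[empty, rng.integers(0, d, empty.sum())] = True

    step = rng.normal(scale=sigma * (ub - lb), size=(n_cand, d))
    return np.clip(cand + mask * step, lb, ub)


def pick_next_point(X, f, cand, w_surrogate=0.8):
    """Score candidates with a cubic RBF surrogate plus a distance criterion.

    Returns the candidate with the best weighted combination of predicted
    objective value and distance to already-evaluated points.
    """
    surrogate = RBFInterpolator(X, f, kernel="cubic")
    pred = surrogate(cand)

    # normalize surrogate predictions to [0, 1] (lower is better)
    s = (pred - pred.min()) / (np.ptp(pred) + 1e-12)
    # distance to the closest evaluated point (farther is better for exploration)
    dist = np.min(np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2), axis=1)
    t = 1.0 - (dist - dist.min()) / (np.ptp(dist) + 1e-12)

    score = w_surrogate * s + (1.0 - w_surrogate) * t
    return cand[np.argmin(score)]
```

In a full surrogate-optimization loop, the selected point would be evaluated with the expensive objective, appended to the evaluated set (X, f), and the best point updated before the next iteration; the local search phase described in the abstract would then refine the best point found by the global search.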
