Journal of Artificial Intelligence and Soft Computing Research

Adapting Differential Evolution Algorithms For Continuous Optimization Via Greedy Adjustment Of Control Parameters

Abstract

Differential evolution (DE) represents a class of evolutionary, meta-heuristic techniques that have been applied successfully to many real-world problems. However, the performance of DE is significantly influenced by its control parameters, such as the scaling factor and the crossover probability. This paper proposes a new adaptive DE algorithm that greedily adjusts the control parameters while DE is running. The basic idea is to perform a greedy search for better parameter assignments over successive learning periods throughout the evolutionary process. Within each learning period, the current parameter assignment and its neighboring assignments are tested (used) a number of times to acquire a reliable assessment of their suitability in the stochastic environment created by the DE operations. The current assignment is then updated with the best candidate identified from the neighborhood, and the search moves on to the next learning period. This greedy parameter adjustment method has been incorporated into basic DE, leading to a new DE algorithm termed Greedy Adaptive Differential Evolution (GADE). GADE has been tested on 25 benchmark functions against five other DE variants. The evaluation results demonstrate that GADE is strongly competitive: it obtained the best rank among the compared algorithms in terms of the sum of relative errors across the benchmark functions at high dimensionality.
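The abstract describes the greedy adaptation scheme only at a high level. The following is a minimal Python sketch of that idea on top of basic DE/rand/1/bin; the ±0.1 parameter neighborhood, the number of generations each candidate is tested per learning period, and the use of successful trial replacements as the suitability score are illustrative assumptions, not the paper's actual settings.

```python
# Sketch of greedy control-parameter adjustment for DE (assumptions noted above).
import numpy as np

def one_generation(f, pop, fit, F, CR, rng):
    """Run one DE/rand/1/bin generation in place; return the number of successful trials."""
    pop_size, dim = pop.shape
    successes = 0
    for i in range(pop_size):
        others = [j for j in range(pop_size) if j != i]
        a, b, c = rng.choice(others, 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        mask = rng.random(dim) < CR
        mask[rng.integers(dim)] = True            # ensure at least one component crosses over
        trial = np.where(mask, mutant, pop[i])
        trial_fit = f(trial)
        if trial_fit <= fit[i]:                   # greedy one-to-one selection
            pop[i], fit[i] = trial, trial_fit
            successes += 1
    return successes

def gade_sketch(f, dim=10, pop_size=40, periods=25, gens_per_test=2, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    F, CR = 0.5, 0.9                              # current parameter assignment

    for _ in range(periods):                      # successive learning periods
        # Candidate set: the current assignment plus its neighbors (assumed +/-0.1 steps).
        candidates = {(F, CR)}
        for dF in (-0.1, 0.0, 0.1):
            for dCR in (-0.1, 0.0, 0.1):
                candidates.add((round(min(max(F + dF, 0.1), 1.0), 2),
                                round(min(max(CR + dCR, 0.0), 1.0), 2)))
        candidates = sorted(candidates)

        # Use each candidate for a few generations to get a less noisy suitability estimate.
        scores = [sum(one_generation(f, pop, fit, Fk, CRk, rng)
                      for _ in range(gens_per_test))
                  for Fk, CRk in candidates]

        # Greedy update: keep the best-scoring assignment for the next period.
        F, CR = candidates[int(np.argmax(scores))]

    best = int(np.argmin(fit))
    return pop[best], float(fit[best]), (F, CR)

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))      # simple continuous test function
    _, best_fit, (F, CR) = gade_sketch(sphere)
    print(f"best fitness: {best_fit:.3e}, final F={F}, CR={CR}")
```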
