Intelligent Data Analysis

An improved opposition based learning firefly algorithm with dragonfly algorithm for solving continuous optimization problems


Abstract

Nowadays, the prevalence of continuous optimization problems has led researchers to develop a variety of methods for solving them. Metaheuristic algorithms are among the most popular and common ways to solve continuous optimization problems. The Firefly Algorithm (FA) is a successful metaheuristic algorithm for solving continuous optimization problems; however, although it performs very well in local search, it is weak at finding solutions in global search. This weakness causes the algorithm to become trapped in local optima, so the balance between exploration and exploitation cannot be well maintained. In this paper, three different approaches based on the processes of the Dragonfly Algorithm (DA) and the Opposition-Based Learning (OBL) method are proposed to improve the exploration, performance, efficiency, and information sharing of the FA and to prevent the FA from getting stuck in local optima. In the first proposed method (FADA), the robust processes of the DA are used to improve the exploration, performance, and efficiency of the FA; the second proposed method (OFA) uses OBL to accelerate the convergence and exploration of the FA. Finally, the third approach, referred to in this paper as OFADA, hybridizes the FADA method with OBL to improve the convergence and accuracy of the FA. The three proposed methods were evaluated on functions with 2, 4, 10, and 30 dimensions. The results showed that the OFADA approach outperformed the other two proposed methods and the other compared metaheuristic algorithms across the different dimensions. In addition, all three proposed methods provided better results than the other metaheuristic algorithms on small-dimensional functions. The performance of many metaheuristic algorithms decreased as the dimensionality of the functions increased, while the three proposed methods, in particular the OFADA approach, converged better toward the target on the higher-dimensional optimization functions than the other metaheuristic algorithms and showed high performance.
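Since the abstract describes the OFA/OFADA hybridization only at a high level, the following is a minimal sketch of how opposition-based learning can be combined with a standard firefly update: opposite candidates are generated by reflecting solutions across the search bounds and kept whenever they improve fitness. The function names (ofa_sketch, opposition), the parameter values, and the sphere objective are illustrative assumptions for demonstration, not the implementation from the paper.

```python
import numpy as np


def sphere(x):
    """Simple test objective (illustrative assumption, not from the paper)."""
    return float(np.sum(x ** 2))


def opposition(pop, lower, upper):
    """Opposition-based learning: reflect each candidate across the bounds."""
    return lower + upper - pop


def ofa_sketch(obj=sphere, dim=10, n=25, iters=200,
               lower=-5.0, upper=5.0,
               beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, size=(n, dim))

    # OBL at initialization: keep the n best among candidates and their opposites.
    both = np.vstack([pop, opposition(pop, lower, upper)])
    fit = np.apply_along_axis(obj, 1, both)
    pop = both[np.argsort(fit)[:n]]

    for _ in range(iters):
        fit = np.apply_along_axis(obj, 1, pop)
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:
                    # Standard firefly move toward a brighter (better) firefly.
                    r2 = float(np.sum((pop[i] - pop[j]) ** 2))
                    beta = beta0 * np.exp(-gamma * r2)
                    pop[i] = np.clip(
                        pop[i] + beta * (pop[j] - pop[i])
                        + alpha * (rng.random(dim) - 0.5),
                        lower, upper)
                    fit[i] = obj(pop[i])
        # OBL jump: replace fireflies whose opposites are fitter.
        opp = opposition(pop, lower, upper)
        opp_fit = np.apply_along_axis(obj, 1, opp)
        better = opp_fit < fit
        pop[better] = opp[better]

    fit = np.apply_along_axis(obj, 1, pop)
    best = int(np.argmin(fit))
    return pop[best], fit[best]


if __name__ == "__main__":
    x_best, f_best = ofa_sketch()
    print("best objective value found:", f_best)
```

This sketch covers only the OBL portion described in the abstract; in the paper's OFADA variant, such opposition steps are further combined with the dragonfly-inspired processes of the FADA method.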
