Home > Foreign Journals > Engineering Computations > A study on performance of MHDA in training MLPs

A study on performance of MHDA in training MLPs



Abstract

Purpose - In recent years, the application of metaheuristics to training neural network models has gained significance due to the drawbacks of deterministic algorithms. This paper aims to propose the use of a recently developed "memory based hybrid dragonfly algorithm" (MHDA) for training the multi-layer perceptron (MLP) model by finding the optimal set of weights and biases.

Design/methodology/approach - The efficiency of MHDA in training MLPs is evaluated by applying it to classification and approximation benchmark data sets. Performance comparisons between MHDA and other training algorithms are carried out, and the significance of the results is established by statistical methods. The computational complexity of the MHDA-trained MLP is estimated.

Findings - Simulation results show that MHDA can effectively find a near-optimal set of weights and biases at a higher convergence rate than other training algorithms.

Originality/value - This paper presents MHDA as an alternative optimization algorithm for training MLPs. MHDA can effectively optimize the set of weights and biases and is a potential trainer for MLPs.
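The abstract states only that MHDA trains an MLP by searching for its weights and biases; the algorithm's actual update rules are not reproduced on this page. As a rough illustration of the general idea of metaheuristic MLP training, the sketch below uses a generic population-based perturbation search (not MHDA itself) to fit a one-hidden-layer MLP to the XOR benchmark; all function names, hyperparameters, and the search strategy are illustrative assumptions.

```python
import numpy as np

def mlp_forward(x, w1, b1, w2, b2):
    # One-hidden-layer MLP: tanh hidden units, sigmoid output.
    h = np.tanh(x @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

def unpack(theta, n_in, n_hid):
    # Map a flat parameter vector onto the MLP's weights and biases.
    i = 0
    w1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    w2 = theta[i:i + n_hid].reshape(n_hid, 1); i += n_hid
    b2 = theta[i:i + 1]
    return w1, b1, w2, b2

def mse(theta, x, y, n_in, n_hid):
    # Fitness function: mean squared error of the encoded MLP on the data.
    pred = mlp_forward(x, *unpack(theta, n_in, n_hid)).ravel()
    return np.mean((pred - y) ** 2)

def train_metaheuristic(x, y, n_hid=4, pop=30, iters=300, seed=0):
    # Generic population-based search over the flat weight/bias vector:
    # each iteration perturbs candidates around the best-so-far solution
    # with a shrinking step size and keeps any improvement.
    rng = np.random.default_rng(seed)
    n_in = x.shape[1]
    dim = n_in * n_hid + n_hid + n_hid + 1
    swarm = rng.uniform(-1.0, 1.0, (pop, dim))
    fit = np.array([mse(c, x, y, n_in, n_hid) for c in swarm])
    best, best_fit = swarm[fit.argmin()].copy(), fit.min()
    for t in range(iters):
        step = 0.5 * (1.0 - t / iters) + 0.05  # decays from ~0.55 to 0.05
        cand = best + rng.normal(0.0, step, (pop, dim))
        cf = np.array([mse(c, x, y, n_in, n_hid) for c in cand])
        if cf.min() < best_fit:
            best_fit, best = cf.min(), cand[cf.argmin()].copy()
    return best, best_fit

# XOR: a standard non-linearly-separable classification benchmark.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0, 1, 1, 0], dtype=float)
theta, err = train_metaheuristic(X, Y)
pred = mlp_forward(X, *unpack(theta, 2, 4)).ravel()
```

Unlike backpropagation, this treats the network purely as a black-box fitness function of the flat parameter vector, which is the property that lets swarm-style algorithms such as the dragonfly algorithm serve as MLP trainers.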


