Topology optimization for artificial neural networks using differential evolution

Abstract

The backpropagation (BP) algorithm is the main algorithm for training feedforward artificial neural networks (ANNs). Because BP is based on gradient descent, it converges to a local optimum in the region of the initial solution. Evolutionary algorithms (EAs), by contrast, search for the global optimum, but their local search ability is weaker than that of BP. This paper presents a hybrid system that uses differential evolution with global and local neighborhoods (DEGL), a variant of differential evolution (DE), to search for a suitable architecture and a near-optimal set of initial connection weights, and then applies the Levenberg-Marquardt training algorithm, a more robust variant of BP, to perform local search from those initial weights. Finally, the performance of the hybrid DEGL+ANN system is compared with that of a DE+ANN hybrid and a plain ANN on classification problems from machine learning benchmarks.
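For illustration, below is a minimal sketch of the DEGL donor-vector step the abstract refers to, assuming the standard ring-topology neighborhood of DEGL; the function name and the parameter choices (k, alpha, beta, w) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def degl_donor(pop, fitness, i, k=2, alpha=0.8, beta=0.8, w=0.5, rng=None):
    """Compute one DEGL donor vector for individual i (illustrative sketch).

    pop:     (NP, D) array of candidate weight vectors
    fitness: (NP,) array of objective values, lower is better
    k:       ring-neighborhood radius
    alpha, beta: scale factors; w blends the global and local donors
    """
    rng = np.random.default_rng() if rng is None else rng
    NP, _ = pop.shape

    # Local model: best member and two random distinct members of the
    # ring neighborhood i-k .. i+k (indices taken modulo NP).
    neigh = [(i + j) % NP for j in range(-k, k + 1)]
    n_best = min(neigh, key=lambda j: fitness[j])
    p, q = rng.choice([j for j in neigh if j != i], size=2, replace=False)
    local = pop[i] + alpha * (pop[n_best] - pop[i]) + beta * (pop[p] - pop[q])

    # Global model: population-wide best plus two random distinct members.
    g_best = int(np.argmin(fitness))
    r1, r2 = rng.choice([j for j in range(NP) if j != i], size=2, replace=False)
    glob = pop[i] + alpha * (pop[g_best] - pop[i]) + beta * (pop[r1] - pop[r2])

    # Blend the two donors: w near 1 emphasizes global exploration,
    # w near 0 emphasizes the local neighborhood.
    return w * glob + (1 - w) * local
```

In the hybrid scheme the abstract describes, such a donor would be crossed over with the target vector to form a trial network encoding, and the weights of the best individual found by DEGL would then seed Levenberg-Marquardt training for the final local search.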
