Pattern Recognition Letters

A continuation approach for training Artificial Neural Networks with meta-heuristics



Abstract

The Artificial Neural Networks research field is among the most active areas in Artificial Intelligence. Training a neural network is an NP-hard optimization problem with several theoretical and computational limitations. In optimization, continuation refers to a homotopy transformation of the fitness function that is used to obtain simpler versions of that fitness function and improve convergence. In this paper we propose an approach to Artificial Neural Network training based on optimization by continuation and meta-heuristic algorithms. The goal is to reduce the overall execution time of training without degrading accuracy. We use continuation together with Particle Swarm Optimization, the Firefly Algorithm and Cuckoo Search to train neural networks on public benchmark datasets. The continuation variants of the studied meta-heuristic algorithms reduce the execution time required to complete training by about 5-30%, with no statistically significant loss of accuracy compared with the standard variants of the meta-heuristics. (C) 2019 Elsevier B.V. All rights reserved.
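To illustrate the general idea (not the paper's exact method), the sketch below combines continuation with Particle Swarm Optimization: the fitness function (here, the MSE of a tiny 2-4-1 network on the XOR task) is replaced by a Gaussian-smoothed surrogate whose smoothing level is annealed toward zero, so the swarm first optimizes a simpler landscape and finishes on the true one. The network size, smoothing schedule, PSO constants, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy task: XOR, a classic non-linearly-separable benchmark.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

DIM = 17  # 2-4-1 network: 2*4 + 4 + 4*1 + 1 parameters, flattened

def loss(w):
    """True fitness: mean squared error of the 2-4-1 network."""
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16], w[16]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

def smoothed_loss(w, sigma, samples=8):
    """Homotopy surrogate: Monte-Carlo estimate of the fitness convolved
    with a Gaussian of scale sigma. sigma = 0 recovers the original loss."""
    if sigma == 0.0:
        return loss(w)
    return np.mean([loss(w + sigma * rng.standard_normal(DIM))
                    for _ in range(samples)])

def pso_continuation(sigmas=(1.0, 0.3, 0.0), n_particles=20, iters=40,
                     inertia=0.7, c1=1.5, c2=1.5):
    """Standard PSO run over a continuation schedule of smoothing levels."""
    pos = rng.uniform(-1, 1, (n_particles, DIM))
    vel = np.zeros((n_particles, DIM))
    pbest = pos.copy()
    for sigma in sigmas:  # continuation: each stage is less smoothed
        # Re-evaluate memories on the current (sharper) surrogate.
        pbest_val = np.array([smoothed_loss(p, sigma) for p in pbest])
        i = int(np.argmin(pbest_val))
        gbest, gbest_val = pbest[i].copy(), pbest_val[i]
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, DIM))
            vel = (inertia * vel + c1 * r1 * (pbest - pos)
                   + c2 * r2 * (gbest - pos))
            pos = pos + vel
            vals = np.array([smoothed_loss(p, sigma) for p in pos])
            improved = vals < pbest_val
            pbest[improved] = pos[improved]
            pbest_val[improved] = vals[improved]
            i = int(np.argmin(pbest_val))
            if pbest_val[i] < gbest_val:
                gbest, gbest_val = pbest[i].copy(), pbest_val[i]
    return gbest, loss(gbest)  # last stage has sigma = 0, i.e. the true loss

best_w, best_loss = pso_continuation()
print(f"best MSE on XOR: {best_loss:.4f}")
```

Because every stage reuses the swarm's positions and personal bests, later (sharper) stages start from regions already found promising on the smoothed landscape; this warm-starting is what the continuation approach exploits to cut iterations, and the same schedule can wrap Firefly Algorithm or Cuckoo Search in place of the PSO loop.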
