Decision Support Systems

Toward global optimization of neural networks : A comparison of the genetic algorithm and backpropagation

Abstract

The recent surge in activity of neural network research in business is not surprising, since the underlying functions controlling business data are generally unknown and the neural network offers a tool that can approximate the unknown function to any desired degree of accuracy. The vast majority of these studies rely on a gradient algorithm, typically a variation of backpropagation, to obtain the parameters (weights) of the model. The well-known limitations of gradient search techniques applied to complex nonlinear optimization problems such as artificial neural networks have often resulted in inconsistent and unpredictable performance. Many researchers have attempted to address the problems associated with the training algorithm by imposing constraints on the search space or by restructuring the architecture of the neural network. In this paper we demonstrate that such constraints and restructuring are unnecessary if a sufficiently complex initial architecture and an appropriate global search algorithm are used. We further show that the genetic algorithm can not only serve as a global search algorithm but, by appropriately defining the objective function, can simultaneously achieve a parsimonious architecture. The value of using the genetic algorithm over backpropagation for neural network optimization is illustrated through a Monte Carlo study which compares each algorithm on in-sample, interpolation, and extrapolation data for seven test functions.
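The abstract's central idea — using a genetic algorithm to search network weights globally, with an objective function that also rewards parsimony — can be sketched minimally. The sketch below is illustrative only: the network size, test function, penalty weight, and GA settings are assumptions, not the paper's actual experimental design (the paper uses seven test functions and a Monte Carlo comparison not reproduced here).

```python
import math
import random

random.seed(0)

# Tiny one-hidden-layer net: y = sum_j v[j] * tanh(w[j]*x + b[j]).
# H and all hyperparameters below are illustrative choices.
H = 4
GENES = 3 * H  # genome = hidden weights w, biases b, output weights v

def predict(genome, x):
    w, b, v = genome[:H], genome[H:2 * H], genome[2 * H:]
    return sum(v[j] * math.tanh(w[j] * x + b[j]) for j in range(H))

# In-sample data for one stand-in test function (sin on [-1, 1]).
xs = [i / 10 for i in range(-10, 11)]
ys = [math.sin(x) for x in xs]

LAMBDA = 0.01  # parsimony pressure: each non-negligible weight costs fitness

def fitness(genome):
    """Lower is better: fit error plus a complexity penalty, so the GA
    simultaneously fits the data and prunes unneeded weights."""
    mse = sum((predict(genome, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
    complexity = sum(1 for g in genome if abs(g) > 1e-3)
    return mse + LAMBDA * complexity

def evolve(pop_size=60, gens=200, sigma=0.3):
    pop = [[random.uniform(-1, 1) for _ in range(GENES)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        elite = pop[:pop_size // 4]          # truncation selection + elitism
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(GENES)    # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(GENES):           # Gaussian mutation
                if random.random() < 0.1:
                    child[i] += random.gauss(0, sigma)
            children.append(child)
        pop = elite + children
    pop.sort(key=fitness)
    return pop[0]

best = evolve()
```

Unlike gradient descent, nothing here requires the objective to be differentiable, which is why the complexity term (a discontinuous count of active weights) can be folded directly into the fitness — the mechanism by which a GA can pursue accuracy and parsimony at once.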
