The recent surge of neural network research in business is not surprising, since the underlying functions governing business data are generally unknown and the neural network offers a tool that can approximate the unknown function to any desired degree of accuracy. The vast majority of these studies rely on a gradient algorithm, typically a variation of backpropagation, to obtain the parameters (weights) of the model. The well-known limitations of gradient search techniques applied to complex nonlinear optimization problems such as artificial neural networks have often resulted in inconsistent and unpredictable performance. Many researchers have attempted to address the problems associated with the training algorithm by imposing constraints on the search space or by restructuring the architecture of the neural network. In this paper we demonstrate that such constraints and restructuring are unnecessary if a sufficiently complex initial architecture and an appropriate global search algorithm are used. We further show that the genetic algorithm can not only serve as a global search algorithm but, by appropriately defining the objective function, can simultaneously achieve a parsimonious architecture. The value of using the genetic algorithm over backpropagation for neural network optimization is illustrated through a Monte Carlo study that compares the two algorithms on in-sample, interpolation, and extrapolation data for seven test functions.
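The idea of combining global weight search with a parsimony-encouraging objective can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the network size, penalty weight `LAM`, activity threshold, and GA operator settings are all assumed for the example. The fitness function sums the approximation error and a penalty on the number of "active" (non-negligible) weights, so the search simultaneously fits the target function and prunes the architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-5-1 feedforward network; all weights flattened into one chromosome.
N_HID = 5
N_W = 3 * N_HID + 1  # input weights, hidden biases, output weights, output bias

def net(w, x):
    w1, b1 = w[:N_HID], w[N_HID:2 * N_HID]
    w2, b2 = w[2 * N_HID:3 * N_HID], w[3 * N_HID]
    h = np.tanh(np.outer(x, w1) + b1)          # hidden layer activations
    return h @ w2 + b2

# One illustrative test function (not necessarily one of the paper's seven).
x = np.linspace(-1.0, 1.0, 50)
y = np.sin(np.pi * x)

LAM = 1e-3  # assumed parsimony penalty weight

def fitness(w):
    mse = np.mean((net(w, x) - y) ** 2)
    n_active = np.sum(np.abs(w) > 0.05)        # weights counted as "in use"
    return mse + LAM * n_active                # error + parsimony penalty

POP, GENS = 60, 200
pop = rng.normal(0.0, 1.0, (POP, N_W))
init_best_f = min(fitness(w) for w in pop)

for _ in range(GENS):
    f = np.array([fitness(w) for w in pop])
    # Binary tournament selection.
    idx = rng.integers(0, POP, (POP, 2))
    winners = np.where(f[idx[:, 0]] < f[idx[:, 1]], idx[:, 0], idx[:, 1])
    parents = pop[winners]
    # Uniform crossover between consecutive parents.
    mask = rng.random((POP, N_W)) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    # Gaussian mutation on a random 10% of genes.
    children += rng.normal(0.0, 0.1, children.shape) * (rng.random(children.shape) < 0.1)
    # Elitism: carry the current best chromosome forward unchanged.
    children[0] = pop[np.argmin(f)]
    pop = children

final_best_f = min(fitness(w) for w in pop)
```

Because the elite chromosome is copied forward unmutated, the best fitness is non-increasing across generations; the penalty term then drives surplus weights toward zero without any explicit architectural constraints.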