The noise associated with conventional techniques for evolutionary improvement of neural network architectures is reduced so that an optimum architecture can be determined more efficiently and more effectively. Parameters that affect the initialization of a neural network architecture are included within the encoding that is used by an evolutionary algorithm to optimize the neural network architecture. The example initialization parameters include an encoding that determines the initial nodal weights used in each architecture at the commencement of the training cycle. Because the initialization parameters are part of the encoding used by the evolutionary algorithm, initialization parameters that have a positive effect on the performance of the resultant evolved network architecture are propagated, and potentially improved, from generation to generation. Conversely, initialization parameters that, for example, cause the resultant evolved network to be poorly trained are not propagated. In accordance with a second aspect of this invention, the encoding also includes parameters that affect the training process, such as the duration of the training cycle, the training inputs applied, and so on. In accordance with a third aspect of this invention, the same set of training or evaluation inputs is applied to all members whose performances are directly compared.
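The scheme described above can be sketched as a toy genetic algorithm whose genome carries both an initialization parameter (a weight-initialization seed) and training parameters (epoch count and learning rate), so that all of them are subject to selection. This is a minimal illustrative sketch, not the patented implementation: the one-weight linear "network", the fitness function, and all names are assumptions introduced here. Note that every member is evaluated on the same fixed data set, mirroring the third aspect above.

```python
import random

# Fixed toy task: learn y = 2*x. The SAME evaluation inputs are applied
# to every member whose fitness is compared (third aspect above).
DATA = [(x / 10.0, 2 * x / 10.0) for x in range(-10, 11)]


def evaluate(genome):
    """Fitness = negative mean squared error after training.

    The genome encodes an initialization parameter (the weight-init seed)
    and training parameters (epochs, learning rate), so good choices of
    all three propagate from generation to generation.
    """
    seed, epochs, lr = genome
    rng = random.Random(seed)
    w = rng.uniform(-1.0, 1.0)  # initial weight determined by the encoded seed
    for _ in range(epochs):
        for x, y in DATA:
            w -= lr * 2 * (w * x - y) * x  # gradient step on squared error
    mse = sum((w * x - y) ** 2 for x, y in DATA) / len(DATA)
    return -mse


def mutate(genome, rng):
    """Perturb one encoded parameter: init seed, epochs, or learning rate."""
    seed, epochs, lr = genome
    choice = rng.randrange(3)
    if choice == 0:
        seed = rng.randrange(10_000)  # re-draw the initialization seed
    elif choice == 1:
        epochs = max(1, epochs + rng.choice([-2, -1, 1, 2]))
    else:
        lr = min(0.5, max(1e-4, lr * rng.choice([0.5, 2.0])))
    return (seed, epochs, lr)


def evolve(generations=20, pop_size=12, rng=None):
    """Truncation-selection GA: poorly initialized/trained members die out."""
    rng = rng or random.Random(0)
    pop = [(rng.randrange(10_000), rng.randrange(1, 10), 0.05)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate, reverse=True)
        survivors = pop[: pop_size // 2]  # best half propagates its encoding
        pop = survivors + [mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=evaluate)


best = evolve()
print("best genome (seed, epochs, lr):", best, "fitness:", evaluate(best))
```

Because the seed, epoch count, and learning rate all live inside the genome, an encoding that yields a poorly trained model scores a low fitness and is simply not selected, which is the mechanism the text describes for filtering out bad initialization parameters.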