In recent years, considerable progress has been made in neural network research, and applications of neural networks have been extended to a large number of different disciplines, including biology, psychology, physics, mathematics, statistics, engineering, operations research, and computer science. The result is a highly interdisciplinary and inspiring research area, and, inevitably, a variety of different terminologies, concepts, and notations. There has also been increasing interest in building complex neural networks consisting of various modular neural subnetworks to carry out increasingly sophisticated tasks.

This dissertation is devoted to the development of a unified mathematical theory of neural learning that forms the basis for describing network topology, for constructing learning rules via nonsmooth analysis and a dynamical-system approach to constrained optimization, and for synthesizing hierarchically organized modular neural networks. The dissertation also illustrates applications of neural networks to time-series forecasting and to physical mapping in molecular biology.

By introducing a neural learning function for various artificial neural network models, learning is represented as an optimization process subject to constraints inherent in a particular problem. Transforming a constrained optimization problem into an unconstrained one through Lagrange multipliers, for example, may lead to a nonsmooth learning function. Nonsmooth analysis and a differential inclusion for solving nonsmooth optimization problems are introduced to form the theoretical basis for constructing learning rules. Several stability theorems for the learning process are proven, guaranteeing that the learning function is minimized as the neural network evolves, i.e., that the network performs optimally. A neural network model based on differential-algebraic equations (DAEs) is also proposed. The global and local convergence properties of neural learning algorithms for constrained optimization problems are analyzed, and simulations of neural network models for constrained optimization problems are carried out. These simulations demonstrate that neural network models can provide a good alternative method for solving constrained optimization problems.

As an application, a general nonlinear autoregressive integrated moving average model based on neural networks is proposed for forecasting time series, and a new dynamic backpropagation learning procedure is introduced. The results of a forecasting competition between the neural network model and a Box-Jenkins forecasting method are also presented. Simulation results on several examples show that the neural network model outperforms the Box-Jenkins model in terms of mean absolute error and mean percentage forecast error.

The neural learning theory is also applied to a physical mapping problem in molecular biology, and an attempt is made to prove the consistency of a neural learning algorithm in this specific setting.
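To fix notation for the optimization view of learning described above, a minimal illustrative formulation follows; the symbols \(E\), \(g\), \(L\), and \(\lambda\) are generic placeholders rather than the dissertation's own notation. Learning minimizes a learning function \(E\) over weights \(w\) subject to problem constraints,

\[
\min_{w \in \mathbb{R}^n} \; E(w) \quad \text{subject to} \quad g_i(w) = 0, \; i = 1,\dots,m,
\]

which a Lagrange-multiplier (or exact-penalty) reformulation turns into an unconstrained problem,

\[
L(w,\lambda) = E(w) + \sum_{i=1}^{m} \lambda_i \, g_i(w);
\]

an exact penalty such as \(\sum_i |g_i(w)|\) makes \(L\) nonsmooth, in which case the learning dynamics can be written as a differential inclusion,

\[
\dot{w}(t) \in -\partial_w L\bigl(w(t), \lambda\bigr),
\]

where \(\partial_w L\) denotes the (Clarke) generalized gradient. Stability results of the kind mentioned above then assert that \(L\) decreases along trajectories of this inclusion.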
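The DAE-based network model mentioned above can be indicated schematically in semi-explicit form, again with placeholder symbols:

\[
\dot{w}(t) = f\bigl(w(t), \lambda(t)\bigr), \qquad 0 = g\bigl(w(t), \lambda(t)\bigr),
\]

where the algebraic equations confine the trajectory to the constraint set while the differential equations drive the learning function downhill.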
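As a complement to the forecasting discussion above, the following is a minimal sketch of a nonlinear autoregressive neural predictor, assuming a sliding-window formulation; the synthetic data, layer sizes, and plain batch gradient descent are illustrative stand-ins, not the dissertation's dynamic backpropagation procedure or its experiments.

import numpy as np

# Illustrative sketch: a one-hidden-layer network fit to predict x_t
# from the previous p values of a series (a nonlinear AR(p) model).
# Plain batch gradient descent stands in for the dissertation's
# dynamic backpropagation procedure, which is not reproduced here.
rng = np.random.default_rng(0)
p, hidden, lr, epochs = 4, 8, 0.05, 2000

# Synthetic series: noisy sine wave, purely for demonstration.
t = np.arange(200)
series = np.sin(0.2 * t) + 0.1 * rng.standard_normal(t.size)

# Sliding windows: rows of p lagged values -> next value.
X = np.stack([series[i:i + p] for i in range(series.size - p)])
y = series[p:]

W1 = rng.standard_normal((p, hidden)) * 0.5
b1 = np.zeros(hidden)
W2 = rng.standard_normal(hidden) * 0.5
b2 = 0.0

for _ in range(epochs):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    yhat = h @ W2 + b2                # linear output unit
    err = yhat - y
    # Backpropagate mean-squared error through both layers.
    gW2 = h.T @ err / y.size
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h ** 2)
    gW1 = X.T @ dh / y.size
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# One-step-ahead forecast from the last p observations.
h = np.tanh(series[-p:] @ W1 + b1)
print("next-value forecast:", h @ W2 + b2)
print("mean absolute error (train):", np.abs(err).mean())

A forecasting competition of the kind reported above would compare such a predictor against a fitted Box-Jenkins (ARIMA) model on held-out data, using mean absolute error and mean percentage forecast error as the criteria.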