We discuss the dynamics of batch learning of multilayer neural networks in the asymptotic limit, where the number of training data is much larger than the number of parameters, emphasizing the parameterization redundancy in overrealizable cases. In addition to showing experimental results on overtraining in multilayer perceptrons and three-layer linear neural networks, we theoretically prove the existence of overtraining in overrealizable cases of the latter model.