Batch training algorithms that use a separate learning rate for each weight are investigated. The adaptive learning rate algorithms of this class that apply inexact one-dimensional subminimization are analyzed, and their global convergence is studied. Simulations are conducted to evaluate the convergence behavior of two training algorithms of this class and to compare them with several popular training methods.
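To illustrate the general idea of per-weight adaptive learning rates in batch training, the following is a minimal sketch using a sign-based adaptation heuristic (in the spirit of delta-bar-delta/Rprop-style rules). It is an assumption-laden illustration, not the specific algorithm analyzed in the paper: the function names, the quadratic test error, and the adaptation factors `up`/`down` are all hypothetical choices made here for demonstration.

```python
import numpy as np

def batch_train(grad, w, eta, n_epochs=100, up=1.2, down=0.5):
    """Batch gradient descent with a separate learning rate per weight.

    Illustrative sign-based adaptation (not the paper's exact method):
    grow a weight's rate while its gradient keeps the same sign,
    shrink it when the sign flips (indicating an overshoot).
    """
    prev_g = np.zeros_like(w)
    for _ in range(n_epochs):
        g = grad(w)                       # full-batch gradient
        agree = g * prev_g > 0            # per-weight sign agreement
        flip = g * prev_g < 0             # per-weight sign change
        eta = np.where(agree, eta * up, np.where(flip, eta * down, eta))
        w = w - eta * g                   # per-weight update
        prev_g = g
    return w, eta

# Toy quadratic error E(w) = 0.5 * sum(a_i * w_i^2); gradient is a * w.
# The differing curvatures (1 vs 10) are what per-weight rates help with.
a = np.array([1.0, 10.0])
w, eta = batch_train(lambda w: a * w,
                     np.array([1.0, 1.0]),
                     np.array([0.05, 0.05]))
```

On this toy quadratic, the rate for the low-curvature weight grows until it overshoots and is cut back, while the high-curvature weight settles on a smaller rate, which is the behavior that motivates per-weight adaptation.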