The main difficulty in implementing natural gradient learning is computing the inverse of the Fisher information matrix when the dimension of the input samples is large. Taking the error function as the activation function, an algorithm for computing the inverse of the Fisher information matrix is designed, and its correctness is verified by simulation. Natural gradient descent is then applied to multilayer perceptron training and compared with the back-propagation (BP) algorithm. The comparison shows that the natural gradient method speeds up learning and can avoid being trapped in plateaus; even when plateaus do appear, they are far less severe. Simulations confirm these learning properties and show that natural gradient learning asymptotically attains the Cramér-Rao lower bound, i.e., it is Fisher efficient, whereas ordinary gradient descent is not. Simulations also confirm that natural gradient descent is robust against additive noise in the training samples.
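The update rule underlying the method above is the natural gradient step θ ← θ − η G⁻¹∇L(θ), where G is the Fisher information matrix. As a minimal, hedged sketch (not the paper's algorithm), the step can be illustrated on a Gaussian linear-regression model, for which the Fisher matrix has the simple closed form XᵀX/σ², rather than on the multilayer perceptron with error-function activation studied in the paper; all names here (`grad_nll`, `fisher`) are illustrative assumptions.

```python
import numpy as np

# Toy data: linear model y = X w_true + Gaussian noise (illustrative only).
rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
sigma = 0.1
y = X @ w_true + sigma * rng.normal(size=n)

def grad_nll(w):
    # Gradient of the average negative log-likelihood (Gaussian noise, known sigma).
    return X.T @ (X @ w - y) / (sigma ** 2 * n)

def fisher(w):
    # Fisher information matrix of this Gaussian model; for linear
    # regression it does not depend on w.
    return X.T @ X / (sigma ** 2 * n)

# Natural gradient descent: theta <- theta - eta * G^{-1} * grad.
w = np.zeros(d)
for _ in range(20):
    G = fisher(w) + 1e-8 * np.eye(d)            # tiny damping keeps G invertible
    w = w - 0.5 * np.linalg.solve(G, grad_nll(w))
```

Preconditioning by G⁻¹ rescales the gradient according to the information geometry of the model; in this linear-Gaussian case each step contracts the error toward the maximum-likelihood solution at a rate set only by η, independently of the conditioning of X, which is the mechanism behind the faster convergence and plateau avoidance reported for the multilayer-perceptron case.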