IEEE Transactions on Neural Networks

Asymptotic distributions associated to Oja's learning equation for neural networks


Abstract

We perform a complete asymptotic performance analysis of the stochastic approximation algorithm (denoted the subspace network learning, or SNL, algorithm) derived from Oja's learning equation, in the case where the learning rate is constant and a large number of patterns is available. This algorithm drives the connection weight matrix W toward an orthonormal basis of a dominant invariant subspace of a covariance matrix. Our approach consists in associating with this algorithm a second stochastic approximation algorithm that governs the evolution of WW^T toward the projection matrix onto this dominant invariant subspace. Then, using a general result of Gaussian approximation theory, we derive the asymptotic distribution of the estimated projection matrix. Closed-form expressions for the asymptotic covariance of the projection matrix estimated by the SNL algorithm, and by the smoothed SNL algorithm that we introduce, are given in the case of independent or correlated learning patterns and are further analyzed. It is found that the structures of these asymptotic covariance matrices are similar to those describing batch estimation techniques. The accuracy of our asymptotic analysis is checked by numerical simulations, and it is found to be valid not only for a "small" learning rate but over a very large domain. Finally, improvements brought by our smoothed SNL algorithm are shown, such as the learning speed/misadjustment tradeoff and the deviation from orthonormality.
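As a concrete illustration (not the paper's own code), the SNL update derived from Oja's learning equation can be sketched in pure Python. The rule is W ← W + η (x − W y) yᵀ with y = Wᵀ x, which drives WW^T toward the projector onto the dominant invariant subspace of the input covariance matrix. The dimensions, constant learning rate η, and synthetic data below are illustrative assumptions, not values from the paper.

```python
import random

def oja_snl_step(W, x, eta):
    """One step of Oja's subspace network learning (SNL) rule:
    W <- W + eta * (x - W y) y^T,  where y = W^T x.
    With a constant learning rate this is the stochastic
    approximation algorithm analyzed in the abstract."""
    d, r = len(W), len(W[0])
    y = [sum(W[i][j] * x[i] for i in range(d)) for j in range(r)]   # y = W^T x
    Wy = [sum(W[i][j] * y[j] for j in range(r)) for i in range(d)]  # W y
    for i in range(d):
        for j in range(r):
            W[i][j] += eta * (x[i] - Wy[i]) * y[j]

def train_snl(d=3, r=2, eta=0.01, n_steps=20000, seed=0):
    """Run SNL on synthetic zero-mean Gaussian data whose first r
    coordinates carry most of the variance, so the dominant invariant
    subspace is span(e_0, ..., e_{r-1}). All settings are illustrative."""
    rng = random.Random(seed)
    W = [[rng.gauss(0.0, 0.3) for _ in range(r)] for _ in range(d)]
    stddevs = [3.0] * r + [0.3] * (d - r)  # dominant vs. minor directions
    for _ in range(n_steps):
        x = [rng.gauss(0.0, s) for s in stddevs]
        oja_snl_step(W, x, eta)
    return W

def projection(W):
    """P = W W^T, the estimated projection matrix whose asymptotic
    distribution is the object of the paper's analysis."""
    d, r = len(W), len(W[0])
    return [[sum(W[i][k] * W[j][k] for k in range(r)) for j in range(d)]
            for i in range(d)]
```

With a constant learning rate the estimate fluctuates around the true projector rather than converging exactly; the size of those fluctuations is what the asymptotic covariance expressions in the paper quantify.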
