Neurocomputing

Online stability of backpropagation-decorrelation recurrent learning

Abstract

We provide a stability analysis, based on nonlinear feedback theory, for the recently introduced backpropagation-decorrelation (BPDC) recurrent learning algorithm, which adapts only the output weights of a possibly large network and can therefore learn in O(N) time. Using a small-gain criterion, we derive a simple sufficient stability inequality. The condition can be monitored online to ensure that the recurrent network remains stable, and it can in principle be applied to any network that adapts only its output weights. Based on these results, BPDC learning is further enhanced with an efficient online rescaling algorithm that stabilizes the network while it adapts. In simulations we find that this mechanism improves learning in the provably stable domain.
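
The monitoring-plus-rescaling scheme described in the abstract can be sketched in a few lines of NumPy. This is an illustrative stand-in, not the BPDC update from the paper: the readout is trained with a plain normalized delta rule, and the stability test uses a generic small-gain bound, L * (||W_res||_2 + ||w_fb||_2 * ||w_out||_2) < 1, obtained from the triangle inequality under the assumption of tanh units (Lipschitz constant L = 1). All names (W_res, w_fb, w_out, max_norm) are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100          # network size
    L = 1.0          # Lipschitz (slope) bound of tanh
    eta = 0.1        # learning rate
    margin = 0.95    # safety margin: keep the loop gain strictly below 1

    # Fixed weights: recurrent matrix scaled to spectral norm 0.8,
    # plus input and output-feedback vectors.
    W_res = rng.normal(size=(N, N))
    W_res *= 0.8 / np.linalg.norm(W_res, 2)
    w_in = rng.normal(size=N)
    w_fb = rng.normal(size=N)
    w_out = np.zeros(N)          # the only adapted weights -> O(N) learning

    # Sufficient small-gain condition via the triangle inequality:
    #   L * (||W_res||_2 + ||w_fb||_2 * ||w_out||_2) < 1,
    # so the loop stays provably stable while ||w_out||_2 < max_norm.
    s_res = np.linalg.norm(W_res, 2)     # computed once: these weights are fixed
    s_fb = np.linalg.norm(w_fb)
    max_norm = margin * (1.0 / L - s_res) / s_fb

    x = np.zeros(N)
    y = 0.0
    for k in range(2000):
        u = np.sin(0.1 * k)                     # toy input signal
        target = np.sin(0.1 * (k + 1))          # one-step-ahead prediction task
        x = np.tanh(W_res @ x + w_in * u + w_fb * y)   # recurrent state update
        y = w_out @ x                           # linear readout
        err = target - y
        w_out += eta * err * x / (x @ x + 1e-6) # normalized delta-rule stand-in

        # Online monitoring + rescaling: one vector norm per step, O(N).
        nw = np.linalg.norm(w_out)
        if nw > max_norm:
            w_out *= max_norm / nw              # shrink readout back into the stable region

Because only the adapted readout changes during learning, the monitored quantity reduces to a single vector norm per step, which is what makes the online check as cheap as the O(N) weight update itself.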
