Foreign-language conference paper

Backpropagation-decorrelation: online recurrent learning with O(N) complexity



Abstract

We introduce a new learning rule for fully recurrent neural networks, which we call the backpropagation-decorrelation (BPDC) rule. It combines two important principles: one-step backpropagation of errors and the use of the temporal memory in the network dynamics by means of decorrelation of the activations. The BPDC rule is derived, and theoretically justified, by regarding learning as a constrained optimization problem, and it applies uniformly in discrete and continuous time. It is very easy to implement and has a minimal complexity of 2N multiplications per time step in the single-output case. Nevertheless, we obtain fast tracking and excellent performance on some benchmark problems, including the Mackey-Glass time series.
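To make the stated cost concrete, the following is a minimal sketch, not the authors' reference implementation: a fixed random recurrent network whose single output weight vector is adapted online by an error-driven update normalized by the activation autocorrelation ||x||² + ε, which captures the decorrelation normalization and the roughly 2N multiplications per time step (one dot product plus one scaled vector update). All network sizes, scalings, and hyperparameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # network size (illustrative)

# Fixed random recurrent and input weights, scaled for stable dynamics (assumption)
W = rng.standard_normal((N, N)) * 0.05
w_in = rng.standard_normal(N) * 0.1
w_out = np.zeros(N)  # the only weights adapted online in this sketch

eta = 0.02   # learning rate (illustrative)
eps = 0.002  # regularizer preventing division by a near-zero norm

x = np.zeros(N)  # network state

def step(u, target):
    """One network update plus a BPDC-style output-weight update."""
    global x, w_out
    x = np.tanh(W @ x + w_in * u)        # recurrent state update
    y = w_out @ x                        # single linear output
    err = target - y                     # one-step output error
    # Decorrelation-normalized update: N mults for x @ x,
    # N mults for the scaled vector update -> ~2N per time step.
    w_out += eta * err * x / (x @ x + eps)
    return y
```

Driving the network with a constant input and target, the normalized update behaves like normalized LMS on the output weights and the tracking error shrinks geometrically, which illustrates the "fast tracking" behavior claimed in the abstract.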


