
On the weight dynamics of recurrent learning


Abstract

We derive continuous-time batch and online versions of the efficient O(N²) training algorithm for fully recurrent networks recently introduced by Atiya and Parlos [2000]. A mathematical analysis of the respective weight dynamics shows that efficient learning is achieved even though, owing to the way errors are backpropagated, the relative rates of weight change remain constant. The result is a highly structured network in which an unspecific internal dynamical reservoir can be distinguished from the output layer, which learns faster and changes at much higher rates. We discuss this result with respect to the recently introduced "echo state" and "liquid state" networks, which have a similar structure.
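The structural split the abstract describes, a fixed, unspecific dynamical reservoir feeding a fast-learning output layer, is the defining feature of echo state networks. The sketch below is a minimal illustration of that idea, not the Atiya-Parlos algorithm itself: a random recurrent reservoir is left untrained, and only the linear readout is fitted (here by ridge regression) to a one-step-ahead prediction task. All names, sizes, and the input signal are illustrative assumptions.

```python
import numpy as np

# Hypothetical echo-state-style sketch (NOT the Atiya-Parlos algorithm):
# a fixed random "reservoir" supplies rich dynamics, and only the output
# (readout) weights are trained, mirroring the reservoir/output-layer
# split the abstract describes.

rng = np.random.default_rng(0)
N = 50          # reservoir size (illustrative choice)
T = 300         # number of time steps

# Fixed internal reservoir: random recurrent weights, rescaled so the
# spectral radius is below 1 (the usual "echo state" condition).
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=(N, 1))

# Drive the reservoir with a simple sinusoidal input.
u = np.sin(np.linspace(0, 8 * np.pi, T)).reshape(T, 1)
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + (W_in @ u[t]).ravel())
    states[t] = x

# Task: predict the input one step ahead from the reservoir state.
X, y = states[:-1], u[1:, 0]

# Only the output layer is learned, in closed form via ridge regression;
# the reservoir weights stay untouched throughout.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

mse = np.mean((X @ W_out - y) ** 2)
```

The contrast is deliberate: the readout is fitted in a single closed-form step while the recurrent weights never move, an extreme version of the "output layer learns faster and changes at much higher rates" behavior the analysis derives for the Atiya-Parlos dynamics.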
