Frontiers in Synaptic Neuroscience

Re-encoding of associations by recurrent plasticity increases memory capacity

Abstract

Recurrent networks have been proposed as a model of associative memory. In such models, memory items are stored in the strength of connections between neurons. These modifiable connections or synapses constitute a shared resource among all stored memories, limiting the capacity of the network. Synaptic plasticity at different time scales can play an important role in optimizing the representation of associative memories, by keeping them sparse, uncorrelated and non-redundant. Here, we use a model of sequence memory to illustrate how plasticity allows a recurrent network to self-optimize by gradually re-encoding the representation of its memory items. A learning rule is used to sparsify large patterns, i.e., patterns with many active units. As a result, pattern sizes become more homogeneous, which increases the network's dynamical stability during sequence recall and allows more patterns to be stored. Finally, we show that the learning rule allows for online learning in that it keeps the network in a robust dynamical steady state while storing new memories and overwriting old ones.