The Journal of Mathematical Neuroscience

Inhomogeneous Sparseness Leads to Dynamic Instability During Sequence Memory Recall in a Recurrent Neural Network Model



Abstract

Theoretical models of associative memory generally assume most of their parameters to be homogeneous across the network. Conversely, biological neural networks exhibit high variability in both structural and activity parameters. In this paper, we extend Willshaw's classical clipped learning rule to networks with inhomogeneous sparseness, i.e., networks in which the number of active neurons may vary across memory items. We evaluate this learning rule for sequence memory networks with instantaneous feedback inhibition and show that, unsurprisingly, memory capacity degrades as the variability in sparseness increases. The loss of capacity, however, is very small for short sequences of fewer than about 10 associations. Most interestingly, we further show that, owing to feedback inhibition, overly large patterns are far less detrimental to memory capacity than overly small patterns.
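The setup described in the abstract can be illustrated with a minimal sketch of a clipped (Willshaw-style) binary memory storing a sequence of patterns with inhomogeneous sparseness, where instantaneous feedback inhibition is approximated by k-winners-take-all on the dendritic sums. All parameter values (network size, sequence length, pattern-size jitter) are illustrative assumptions, not the paper's actual simulation settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200             # number of neurons (illustrative assumption)
seq_len = 8         # short sequence; the paper reports little capacity loss below ~10 associations
k_mean, k_jitter = 10, 4   # inhomogeneous sparseness: pattern size varies per item

# Generate a sequence of binary patterns with variable numbers of active neurons.
sizes = rng.integers(k_mean - k_jitter, k_mean + k_jitter + 1, size=seq_len)
patterns = np.zeros((seq_len, n), dtype=np.uint8)
for p, k in zip(patterns, sizes):
    p[rng.choice(n, size=k, replace=False)] = 1

# Clipped learning: a binary synapse is switched on (and stays on) whenever
# its pre- and postsynaptic neurons are coactive in consecutive patterns.
W = np.zeros((n, n), dtype=np.uint8)
for pre, post in zip(patterns[:-1], patterns[1:]):
    W |= np.outer(post, pre)

def recall_step(x, k):
    """One recall step: dendritic sums through W, then feedback inhibition,
    modeled here as k-winners-take-all on the summed input."""
    s = W @ x
    out = np.zeros(n, dtype=np.uint8)
    if s.max() > 0:
        out[np.argsort(s)[-k:]] = 1
    return out

# Replay the sequence from its first pattern and count correct transitions.
x = patterns[0]
correct = 0
for t in range(1, seq_len):
    x = recall_step(x, sizes[t])
    correct += np.array_equal(x, patterns[t])
print(f"correctly recalled {correct}/{seq_len - 1} transitions")
```

The k-winners-take-all step is only a stand-in for the paper's instantaneous feedback inhibition; it already hints at the asymmetry noted in the abstract, since inhibition caps the damage an overly large stored pattern can do, while an overly small one simply provides too little drive.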


