Biological Cybernetics

Binary Willshaw learning yields high synaptic capacity for long-term familiarity memory


Abstract

In this study, we investigate from a computational perspective the efficiency of the Willshaw synaptic update rule in the context of familiarity discrimination, a binary-answer, memory-related task that psychophysical experiments have linked to modified neural activity patterns in the prefrontal and perirhinal cortex regions. Our motivation for revisiting this well-known learning prescription is two-fold: first, the switch-like nature of the induced synaptic bonds, as there is evidence that biological synaptic transitions may occur in a discrete, stepwise fashion; second, the possibility that in the mammalian brain unused, silent synapses may be pruned in the long term. Besides the usual pattern and network capacities, we calculate the synaptic capacity of the model, a recently proposed measure in which only the functional subset of synapses is taken into account. We find that in terms of network capacity, Willshaw learning is strongly affected by the pattern coding rates, which must be kept fixed and very low at all times to achieve non-zero capacity in the large-network limit. The information carried per functional synapse, however, diverges and is comparable to that of the pattern-association case, even for more realistic, moderately low activity levels that are a function of network size.
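The binary Willshaw rule the abstract refers to can be sketched in a few lines of NumPy: each synapse is a single bit that switches on, irreversibly, the first time its pre- and postsynaptic units are co-active in a stored pattern. The network size `N`, coding level `k`, pattern count `M`, and the familiarity criterion used here (all k² dendritic contributions among a probe's active units must be present) are illustrative assumptions, not the paper's exact model or discriminator.

```python
import numpy as np

rng = np.random.default_rng(0)
N, k, M = 200, 7, 30  # units, active units per pattern, stored patterns

# Sparse binary patterns with exactly k active units each.
patterns = np.zeros((M, N), dtype=np.uint8)
for p in patterns:
    p[rng.choice(N, size=k, replace=False)] = 1

# Binary Willshaw (clipped Hebbian) learning: OR-accumulate the outer
# product of each pattern with itself; a set bit never switches back off.
W = np.zeros((N, N), dtype=np.uint8)
for p in patterns:
    W |= np.outer(p, p)

def familiar(x: np.ndarray, W: np.ndarray, k: int) -> bool:
    """Judge x familiar iff every synapse among its k active units is on,
    i.e. the quadratic form x.W.x reaches its maximal value k*k."""
    return int(x.astype(int) @ W.astype(int) @ x.astype(int)) == k * k

# Every stored pattern is reported familiar (storage never produces a miss);
# a freshly drawn random pattern is almost surely judged novel.
novel = np.zeros(N, dtype=np.uint8)
novel[rng.choice(N, size=k, replace=False)] = 1
print(all(familiar(p, W, k) for p in patterns), familiar(novel, W, k))
```

The zero entries of `W` after learning are the "unused, silent synapses" the abstract mentions: pruning them leaves the discriminator's output unchanged, which is why the synaptic capacity is computed over the functional (set) synapses only.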
