PLoS Computational Biology

Memory Capacity of Networks with Stochastic Binary Synapses

Abstract

In standard attractor neural network models, specific patterns of activity are stored in the synaptic matrix, so that they become fixed point attractors of the network dynamics. The storage capacity of such networks has been quantified in two ways: the maximal number of patterns that can be stored, and the stored information measured in bits per synapse. In this paper, we compute both quantities in fully connected networks of N binary neurons with binary synapses, storing patterns with coding level f, in the large N and sparse coding limits (N → ∞, f → 0). We also derive finite-size corrections that accurately reproduce the results of simulations in networks of tens of thousands of neurons. These methods are applied to three different scenarios: (1) the classic Willshaw model, (2) networks with stochastic learning in which patterns are shown only once (one-shot learning), (3) networks with stochastic learning in which patterns are shown multiple times. The storage capacities are optimized over network parameters, which allows us to compare the performance of the different models. We show that finite-size effects strongly reduce the capacity, even for networks of realistic sizes. We discuss the implications of these results for memory storage in the hippocampus and cerebral cortex.
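To make the two learning rules named in the abstract concrete, here is a minimal NumPy sketch of the classic Willshaw rule and a stochastic one-shot variant, followed by a recall test and a crude bits-per-synapse bound. It is not the paper's implementation: the parameters N, f, P, q, the `recall` helper, and the threshold choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper):
N = 1000   # network size
f = 0.02   # coding level: fraction of active neurons per pattern
P = 200    # number of stored patterns
q = 0.5    # potentiation probability in the stochastic variant

# P sparse binary patterns with coding level f.
patterns = (rng.random((P, N)) < f).astype(np.uint8)

# Classic Willshaw rule: a binary synapse J_ij is switched to 1 if
# neurons i and j are coactive in at least one stored pattern.
J_willshaw = np.zeros((N, N), dtype=np.uint8)
for xi in patterns:
    J_willshaw |= np.outer(xi, xi)

# Stochastic one-shot variant: each pattern is shown once, and a
# coactivated synapse is potentiated only with probability q.
J_stochastic = np.zeros((N, N), dtype=np.uint8)
for xi in patterns:
    coactive = np.outer(xi, xi).astype(bool)
    J_stochastic[coactive & (rng.random((N, N)) < q)] = 1

def recall(J, cue, theta):
    """One synchronous update: threshold the summed synaptic input."""
    h = J.astype(np.int32) @ cue.astype(np.int32)
    return (h >= theta).astype(np.uint8)

# Retrieve a stored pattern from a full cue; a standard Willshaw
# threshold is the number of active units in the cue.
cue = patterns[0]
theta = int(cue.sum())
errors = int(np.sum(recall(J_willshaw, cue, theta) != cue))
print(f"Willshaw recall errors for pattern 0: {errors}")

# Crude upper bound on stored information, valid only if all P patterns
# are retrieved without error: P times the entropy of a coding-level-f
# binary pattern, divided by the N^2 synapses.
H = -(f * np.log2(f) + (1 - f) * np.log2(1 - f))
print(f"upper bound: {P * N * H / N**2:.4f} bits/synapse")
```

The sketch illustrates the trade-off the abstract compares: the deterministic Willshaw rule extracts the most from a single presentation but saturates the synaptic matrix quickly, while stochastic potentiation (q < 1) leaves a noisier trace per pattern in exchange for slower saturation, which is what separates the one-shot and multiple-presentation scenarios.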
