
Memory Dynamics in Attractor Networks with Saliency Weights


Abstract

Memory is a fundamental part of computational systems like the human brain. Theoretical models identify memories as attractors of neural network activity patterns, based on the theory that attractor (recurrent) neural networks can capture crucial characteristics of memory, such as encoding, storage, retrieval, and long-term and working memory. In such networks, long-term storage of the memory patterns is enabled by synaptic strengths that are adjusted according to activity-dependent plasticity mechanisms (of which the most widely recognized is the Hebbian rule) such that the attractors of the network dynamics represent the stored memories. Most previous studies on associative memory have focused on Hopfield-like binary networks, and the learned patterns are often assumed to be uncorrelated so that interactions between memories are minimized. In this letter, we restrict our attention to a more biologically plausible attractor network model and study the neuronal representations of correlated patterns. We examine the role of saliency weights in memory dynamics. Our results demonstrate that the retrieval process of the memorized patterns is characterized by the saliency distribution, which shapes the landscape of the attractors. We establish the conditions under which the network state converges to a unique memory or to multiple memories. The analytical result also holds for variable coding levels and nonbinary levels, indicating a general property emerging from correlated memories. Our results confirm the advantage of computing with graded-response neurons over binary neurons (i.e., a reduction of spurious states). We also find that a nonuniform saliency distribution can contribute to the disappearance of spurious states when they exist.
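The saliency-weighted Hebbian storage and attractor retrieval described in the abstract can be sketched as a minimal Hopfield-style simulation. This is an illustrative toy model, not the paper's actual network: the network size, the saliency values `saliency`, and the use of random uncorrelated binary patterns are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 3  # neurons, stored patterns (illustrative sizes)

# Random +/-1 patterns; hypothetical saliency weights, one per pattern
patterns = rng.choice([-1, 1], size=(P, N))
saliency = np.array([1.0, 0.6, 0.3])  # assumed values for illustration

# Saliency-weighted Hebbian rule: W = (1/N) * sum_mu s_mu * xi^mu (xi^mu)^T
W = sum(s * np.outer(p, p) for s, p in zip(saliency, patterns)) / N
np.fill_diagonal(W, 0.0)  # no self-connections

def retrieve(cue, steps=50):
    """Iterate synchronous sign-threshold dynamics until a fixed point."""
    state = cue.copy()
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Cue the most salient pattern with 10% of its bits flipped
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1

# Overlap with the stored pattern; an overlap of 1.0 means perfect recall
overlap = retrieve(cue) @ patterns[0] / N
```

In this sketch the saliency factors scale how strongly each pattern is imprinted in the weight matrix, so more salient patterns carve deeper basins of attraction, which is the qualitative effect on the attractor landscape that the letter analyzes.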

Bibliographic Information

  • Source
    Neural Computation | 2010, No. 7 | pp. 1899-1926 | 28 pages
  • Authors

    Huajin Tang; Haizhou Li; Rui Yan;

  • Author Affiliations

    Institute for Infocomm Research, Agency for Science, Technology and Research, Singapore 138632;

    Institute for Infocomm Research, Agency for Science, Technology and Research, Singapore 138632, and Department of Computer Science and Statistics, University of Eastern Finland, 80101 Joensuu, Finland;

    Institute for Infocomm Research, Agency for Science, Technology and Research, Singapore 138632;

  • Indexed in: Science Citation Index (SCI); Chemical Abstracts (CA)
  • Original Format: PDF
  • Language: English (eng)
