Journal article · IEEE Transactions on Neural Networks and Learning Systems

Nonbinary Associative Memory With Exponential Pattern Retrieval Capacity and Iterative Learning



Abstract

We consider the problem of neural association for a network of nonbinary neurons. Here, the task is first to memorize a set of patterns using a network of neurons whose states take values from a finite number of integer levels. Later, the same network should be able to recall the previously memorized patterns from noisy versions of them. Prior work in this area considers storing a finite number of purely random patterns and has shown that the pattern retrieval capacity (the maximum number of patterns that can be memorized) scales only linearly with the number of neurons in the network. In our formulation of the problem, we concentrate on exploiting the redundancy and internal structure of the patterns to improve the pattern retrieval capacity. Our first result shows that if the given patterns have a suitable linear-algebraic structure, i.e., they form a subspace of the set of all possible patterns, then the pattern retrieval capacity is exponential in the number of neurons. The second result extends this finding to cases where the patterns have weak minor components, i.e., where the smallest eigenvalues of the correlation matrix tend toward zero. We use these minor components (or the basis vectors of the pattern null space) to increase both the pattern retrieval capacity and the error-correction capability. An iterative algorithm is proposed for the learning phase, and two simple algorithms are presented for the recall phase. Using analytical methods and simulations, we show that the proposed methods can tolerate a fair number of errors in the input while memorizing an exponentially large number of patterns.
