International Conference on Neural Information Processing

Associative Memories Using Multilayer Perceptrons with 3-Valued Weights and Sparsely Interconnected Neural Networks

Abstract

Associative memories composed of sparsely interconnected neural networks (SINNs) are suitable for hardware implementation. However, the sparsely interconnected structure also reduces the capability of SINNs as associative memories. Although this problem can be solved by increasing the number of interconnections, the hardware cost rises rapidly. Therefore, we propose associative memories that combine SINNs with multilayer perceptrons (MLPs) having 3-valued weights. Such MLPs can be realized at a lower cost than additional interconnections in the SINN, and they provide each neuron in the SINN with global information about the input pattern, which improves the storage capacity. Finally, simulations confirm that the proposed associative memories perform well.
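As a rough illustration of the general idea only (not the paper's actual construction or training procedure), the sketch below builds a Hopfield-style associative memory whose weight matrix is masked so that each neuron keeps only a few interconnections, and adds a small two-layer perceptron whose weights are quantized to {-1, 0, +1}; its output is fed to every neuron as a global bias during recall. All sizes, names, and the random quantization scheme are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

N = 64          # neurons in the recurrent (SINN-like) network
K = 8           # interconnections kept per neuron (sparse)
P = 4           # number of stored bipolar patterns

# Random bipolar patterns (+1 / -1) to store.
patterns = rng.choice([-1, 1], size=(P, N))

# --- Sparsely interconnected recurrent network: Hebbian weights, masked ---
mask = np.zeros((N, N), dtype=bool)
for i in range(N):
    neigh = rng.choice([j for j in range(N) if j != i], size=K, replace=False)
    mask[i, neigh] = True

W = (patterns.T @ patterns).astype(float) / N
np.fill_diagonal(W, 0.0)
W *= mask                      # keep only the sparse interconnections

# --- Small MLP with weights quantized to {-1, 0, +1} ---
# In the paper the MLP would be trained; here we merely quantize random
# weights to illustrate the 3-valued constraint.
H = 16
W1 = np.sign(rng.standard_normal((N, H))) * (rng.random((N, H)) > 0.5)
W2 = np.sign(rng.standard_normal((H, N))) * (rng.random((H, N)) > 0.5)

def mlp_bias(x):
    """Global information about the input pattern, shared with every neuron."""
    h = np.tanh(x @ W1 / np.sqrt(N))
    return np.tanh(h @ W2 / np.sqrt(H))

def recall(x, steps=20, lam=0.5):
    """Synchronous recall: sparse local field plus the MLP's global bias."""
    g = mlp_bias(x)            # computed once from the noisy probe
    for _ in range(steps):
        x = np.sign(W @ x + lam * g)
        x[x == 0] = 1
    return x

# Probe with a noisy version of the first stored pattern.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1

print("overlap with stored pattern:", (recall(probe) @ patterns[0]) / N)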
