International Conference on Neural Information Processing

Training the Hopfield Neural Network for Classification Using a STDP-Like Rule



Abstract

The backpropagation algorithm has played a critical role in training deep neural networks. Many studies suggest that the brain may implement a similar algorithm, but most of these models require symmetric weights between neurons, which makes them less biologically plausible. Inspired by some recent works by Bengio et al., we show that the well-known Hopfield neural network (HNN) can be trained in a biologically plausible way. The network can take hierarchical architectures and the weights between neurons are not necessarily symmetric. The network runs in two alternating phases. The weight change is proportional to the firing rate of the presynaptic neuron and the state (or membrane potential) change of the postsynaptic neuron between the two phases, which approximates a classical spike-timing-dependent plasticity (STDP) rule. Several HNNs with one or two hidden layers are trained on the MNIST dataset and all of them converge to low training errors. These results further push our understanding of the brain mechanism for supervised learning.
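To make the described learning rule concrete, below is a minimal NumPy sketch of the update: the network settles in two alternating phases, and each weight changes in proportion to the presynaptic firing rate times the change of the postsynaptic state (membrane potential) between the phases. The layer sizes, the logistic rate function, and the stand-in phase states are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def rate(u):
    # Firing rate as a squashing function of the membrane potential
    # (a logistic function is assumed here for illustration).
    return 1.0 / (1.0 + np.exp(-u))

n_pre, n_post = 784, 100                          # e.g. MNIST pixels -> one hidden layer
W = rng.normal(0.0, 0.01, size=(n_pre, n_post))   # weights need not be symmetric
eta = 0.01                                        # learning rate

# Presynaptic membrane potentials (stand-in for a settled network state).
u_pre = rng.normal(size=n_pre)

# Postsynaptic membrane potentials at the end of each of the two phases.
# Phase 1: the network relaxes with only the input clamped.
# Phase 2: the output is nudged toward the target and the network relaxes
# again; here the second phase is a stand-in perturbation for illustration.
u_post_phase1 = rate(u_pre) @ W
u_post_phase2 = u_post_phase1 + 0.1 * rng.normal(size=n_post)

# STDP-like update: presynaptic firing rate times the postsynaptic
# state change between the two phases.
W += eta * np.outer(rate(u_pre), u_post_phase2 - u_post_phase1)
```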
