International Conference on Neural Information Processing

Training the Hopfield Neural Network for Classification Using a STDP-Like Rule



Abstract

The backpropagation algorithm has played a critical role in training deep neural networks. Many studies suggest that the brain may implement a similar algorithm, but most such models require symmetric weights between neurons, which makes them less biologically plausible. Inspired by recent works by Bengio et al., we show that the well-known Hopfield neural network (HNN) can be trained in a biologically plausible way. The network can adopt hierarchical architectures, and the weights between neurons are not necessarily symmetric. The network runs in two alternating phases. The weight change is proportional to the firing rate of the presynaptic neuron and the change in the state (or membrane potential) of the postsynaptic neuron between the two phases, which approximates a classical spike-timing-dependent plasticity (STDP) rule. Several HNNs with one or two hidden layers are trained on the MNIST dataset, and all of them converge to low training errors. These results further our understanding of how the brain might implement supervised learning.
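The two-phase rule described in the abstract is closely related to the contrastive, equilibrium-propagation-style learning of the Bengio et al. works it cites. As a rough illustration only, the NumPy sketch below runs one such update for a single-hidden-layer, layered Hopfield-style network: a free relaxation with the input clamped, a second relaxation with the output weakly nudged toward the target, and a weight change equal to the presynaptic firing rate times the change in the postsynaptic state between the two phases. The relaxation dynamics, nonlinearity, nudging strength `beta`, learning rate, and layer sizes are all illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes for an input -> hidden -> output network (weights need not be symmetric).
n_in, n_hid, n_out = 784, 128, 10

W_ih = rng.normal(0.0, 0.05, (n_in, n_hid))   # input  -> hidden
W_ho = rng.normal(0.0, 0.05, (n_hid, n_out))  # hidden -> output
W_oh = rng.normal(0.0, 0.05, (n_out, n_hid))  # output -> hidden feedback (not tied to W_ho)


def rho(s):
    """Map a unit's state to a firing rate (logistic nonlinearity, an assumption)."""
    return 1.0 / (1.0 + np.exp(-s))


def relax(x, h, y, target=None, beta=0.0, steps=30, dt=0.5):
    """Let hidden/output states settle while the input x stays clamped.
    If a target is given, the output is weakly nudged toward it (second phase)."""
    for _ in range(steps):
        dh = -h + rho(x) @ W_ih + rho(y) @ W_oh
        dy = -y + rho(h) @ W_ho
        if target is not None:
            dy = dy + beta * (target - rho(y))  # weak top-down nudge
        h = h + dt * dh
        y = y + dt * dy
    return h, y


def train_step(x, target, lr=0.01, beta=0.5):
    """One two-phase, STDP-like update:
    delta_w_ij ~ (presynaptic firing rate) * (postsynaptic state change between phases)."""
    global W_ih, W_ho, W_oh
    h0, y0 = np.zeros(n_hid), np.zeros(n_out)
    # Phase 1: free relaxation, only the input clamped.
    h_free, y_free = relax(x, h0, y0)
    # Phase 2: relax again while nudging the output toward the target.
    h_nudge, y_nudge = relax(x, h_free, y_free, target=target, beta=beta)
    # Local weight updates from quantities available at the two endpoints.
    W_ih += lr * np.outer(rho(x), h_nudge - h_free)
    W_ho += lr * np.outer(rho(h_free), y_nudge - y_free)
    W_oh += lr * np.outer(rho(y_free), h_nudge - h_free)
```

Under these assumptions, a training loop would call `train_step(x, one_hot_label)` for each MNIST example and read the predicted class from `rho(y_free)` after a free relaxation at test time.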
