Computational Neuroscience Meeting

Slow stochastic learning with global inhibition: a biological solution to the binary perceptron problem



Abstract

Networks of neurons connected by plastic all-or-none synapses tend to quickly forget previously acquired information when new patterns are learned. For random uncorrelated patterns, this problem can be solved by randomly selecting a small fraction of synapses to be modified upon each stimulus presentation (slow stochastic learning). Here we show that more complex, but still linearly separable, patterns can be learned by networks with binary excitatory synapses in a finite number of presentations, provided that: (1) there is non-vanishing global inhibition, (2) the binary synapses are changed with small enough probability (slow learning), and only when the output neuron does not give the desired response (as in the classical perceptron rule), and (3) the neuronal threshold separating the total synaptic inputs corresponding to different classes is small enough.
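The learning rule described in the abstract can be sketched in a few lines: binary (0/1) excitatory synapses, a global inhibitory term proportional to total input activity, and stochastic updates applied only on errors. This is a minimal illustration, not the authors' implementation; the parameter names (`p_learn`, `inhibition`, `theta`) and the training loop structure are assumptions for the sketch.

```python
import random

def train_binary_perceptron(patterns, labels, n_inputs,
                            p_learn=0.05, inhibition=0.5, theta=0.0,
                            max_epochs=1000, seed=0):
    """Slow stochastic learning with binary (0/1) excitatory synapses.

    On each error, every active synapse is flipped toward the desired
    output with small probability p_learn (slow learning); a global
    inhibitory term proportional to the total input activity is
    subtracted from the excitatory drive. Illustrative sketch only.
    """
    rng = random.Random(seed)
    w = [rng.randint(0, 1) for _ in range(n_inputs)]  # binary synapses
    for _ in range(max_epochs):
        errors = 0
        for x, y in zip(patterns, labels):
            # total excitatory input minus global inhibition
            h = sum(wi * xi for wi, xi in zip(w, x)) - inhibition * sum(x)
            out = 1 if h > theta else 0
            if out != y:  # perceptron rule: modify synapses only on error
                errors += 1
                for i in range(n_inputs):
                    if x[i] == 1 and rng.random() < p_learn:
                        # potentiate on a missed 1, depress on a false 1
                        w[i] = 1 if y == 1 else 0
        if errors == 0:  # all patterns classified; learning has converged
            break
    return w
```

With global inhibition at 0.5, the output fires only when more than half of the active inputs carry a potentiated synapse, so any "majority of a good subset" labeling is linearly separable under this rule and is typically learned in a finite number of presentations when `p_learn` is small.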
