Frontiers in Neuroscience

ReStoCNet: Residual Stochastic Binary Convolutional Spiking Neural Network for Memory-Efficient Neuromorphic Computing



Abstract

In this work, we propose ReStoCNet, a residual stochastic multilayer convolutional Spiking Neural Network (SNN) composed of binary kernels, to reduce the synaptic memory footprint and enhance the computational efficiency of SNNs for complex pattern recognition tasks. ReStoCNet consists of an input layer followed by stacked convolutional layers for hierarchical input feature extraction, pooling layers for dimensionality reduction, and a fully-connected layer for inference. In addition, we introduce residual connections between the stacked convolutional layers to improve the hierarchical feature learning capability of deep SNNs. We propose a Spike Timing Dependent Plasticity (STDP) based probabilistic learning algorithm, referred to as Hybrid-STDP (HB-STDP), incorporating Hebbian and anti-Hebbian learning mechanisms, to train the binary kernels forming ReStoCNet in a layer-wise unsupervised manner. We demonstrate the efficacy of ReStoCNet and the presented HB-STDP based unsupervised training methodology on the MNIST and CIFAR-10 datasets. We show that residual connections enable the deeper convolutional layers to self-learn useful high-level input features and mitigate the accuracy loss observed in deep SNNs devoid of residual connections. The proposed ReStoCNet offers ≥20× kernel memory compression compared to a full-precision (32-bit) SNN while yielding sufficiently high classification accuracy on the chosen pattern recognition tasks.
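To make the idea of probabilistic Hebbian/anti-Hebbian training of binary kernels more concrete, the sketch below shows one way such an update could look. It is a minimal illustration assuming a {-1, +1} weight alphabet, a single STDP time window, and fixed switching probabilities; the function name, parameters, and exact rule are hypothetical and are not taken from the paper.

```python
# Minimal sketch (assumptions, not the paper's exact HB-STDP rule):
# binary weights in {-1, +1} are flipped probabilistically depending on
# whether a presynaptic spike preceded the postsynaptic spike (Hebbian)
# or not (anti-Hebbian).
import numpy as np

rng = np.random.default_rng(0)

def hb_stdp_update(kernel, pre_spike_times, post_spike_time,
                   window=20.0, p_pot=0.5, p_dep=0.3):
    """Probabilistically flip binary weights based on relative spike timing.

    kernel          : 1-D array of binary weights in {-1, +1}, one per input.
    pre_spike_times : most recent spike time of each presynaptic input.
    post_spike_time : time at which the postsynaptic neuron spiked.
    """
    kernel = kernel.copy()
    dt = post_spike_time - pre_spike_times        # causal if 0 <= dt <= window
    causal = (dt >= 0) & (dt <= window)

    # Hebbian branch: inputs that spiked shortly before the output spike
    # are switched to +1 with probability p_pot.
    flip_up = causal & (rng.random(kernel.shape) < p_pot)
    kernel[flip_up] = 1

    # Anti-Hebbian branch: the remaining inputs are switched to -1 with
    # probability p_dep.
    flip_down = ~causal & (rng.random(kernel.shape) < p_dep)
    kernel[flip_down] = -1
    return kernel

# Toy usage: a flattened 3x3 binary kernel driven by 9 inputs.
kernel = rng.choice([-1, 1], size=9)
pre_times = rng.uniform(0, 40, size=9)   # last presynaptic spike times (ms)
kernel = hb_stdp_update(kernel, pre_times, post_spike_time=25.0)
print(kernel)
```

Because the weights never leave the binary alphabet, only one bit per synapse needs to be stored, which is the source of the kernel memory compression relative to 32-bit weights reported in the abstract.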
