IEEE Transactions on Circuits and Systems I: Regular Papers

sBSNN: Stochastic-Bits Enabled Binary Spiking Neural Network With On-Chip Learning for Energy Efficient Neuromorphic Computing at the Edge



Abstract

In this work, we propose a stochastic Binary Spiking Neural Network (sBSNN) composed of stochastic spiking neurons and binary synapses (stochastic only during training) that computes probabilistically with one-bit precision for power-efficient and memory-compressed neuromorphic computing. We present an energy-efficient implementation of the proposed sBSNN using the 'stochastic bit' as the core computational primitive to realize the stochastic neurons and synapses, which are fabricated in a 90 nm CMOS process, to achieve efficient on-chip training and inference for image recognition tasks. The measured data show that the 'stochastic bit' can be programmed to mimic spiking neurons and the stochastic Spike Timing Dependent Plasticity (sSTDP) rule for training the binary synaptic weights without expensive random number generators. Our results indicate that the proposed sBSNN realization offers the possibility of up to 32x neuronal and synaptic memory compression compared to a full-precision (32-bit) SNN, and an energy efficiency of 89.49 TOPS/Watt for a two-layer fully-connected SNN.
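To make the abstract's two core ideas concrete, the following is a minimal software sketch of (a) a one-bit stochastic spiking neuron that fires with a probability set by its membrane potential, and (b) a stochastic STDP update that flips a binary {0, 1} weight with a probability decaying exponentially in the spike-timing difference. This is an illustration under assumed functional forms (sigmoid firing probability, exponential timing window), not the paper's circuit: on the fabricated chip the randomness comes from the 'stochastic bit' primitive, whereas here a software PRNG stands in for it. All function names and constants (`tau`, the sigmoid) are assumptions for illustration.

```python
import math
import random

def stochastic_neuron_fires(v_mem, rng):
    """One-bit stochastic neuron: emit a spike with probability
    sigmoid(v_mem). Higher membrane potential -> more likely to fire."""
    p_fire = 1.0 / (1.0 + math.exp(-v_mem))
    return rng.random() < p_fire

def sstdp_update(weight, dt, tau=20.0, rng=random):
    """Stochastic STDP on a binary weight (0 or 1).

    dt = t_post - t_pre. With probability exp(-|dt|/tau) the weight is
    rewritten: potentiated to 1 for a causal pair (dt > 0), depressed to 0
    for an anti-causal pair (dt < 0); otherwise it is left unchanged.
    The weight stays one-bit at all times -- only the *update* is random.
    """
    p = math.exp(-abs(dt) / tau)
    if rng.random() < p:
        return 1 if dt > 0 else 0
    return weight  # no update on this trial

rng = random.Random(42)
# Repeated causal pre->post pairings (dt = +5) drive a binary weight to 1.
w = 0
for _ in range(50):
    w = sstdp_update(w, dt=5.0, rng=rng)
print(w)
```

Because each weight is a single bit rather than a 32-bit float, storing it this way directly yields the 32x synaptic memory compression cited above; the probabilistic update plays the role that small fractional weight increments play in full-precision STDP.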
