
STBNN: Hardware-friendly spatio-temporal binary neural network with high pattern recognition accuracy



Abstract

In recent years, weight-binarized neural network (BNN) technology has made great progress. However, neural networks with both binarized inputs and binarized weights suffer from either low pattern recognition accuracy or inefficient hardware implementation. This work proposes a spatio-temporal binary neural network (STBNN) to solve this problem. STBNN has binary network inputs/outputs, binary neuron inputs/outputs, and binarized weights, and it integrates the computationally expensive batch normalization (BN) operation widely used in previous BNNs into the neuron threshold. STBNN substantially reduces computing resources and storage space while maintaining high accuracy (e.g., 98.0% on the MNIST test set). With binary inputs (0 or 1) and binarized weights (+/- 1), the product of an input and a weight can be realized in hardware by a 1-bit Signed AND operation instead of a multiplication, significantly reducing computing resources, memory requirements, and power consumption. The results show that, compared with a 32-bit multi-layer perceptron (MLP)-based hardware design, the STBNN-based hardware design typically reduces these three indicators by 84.2%, 96.4%, and 96.7%, respectively. This work provides an effective method for constructing hardware-friendly neural network models and a guide for designing an extremely hardware-efficient neural network processor. (C) 2020 Elsevier B.V. All rights reserved.
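As a concrete illustration, below is a minimal Python sketch of the two mechanisms the abstract describes: folding the batch normalization (BN) operation into a per-neuron firing threshold, and replacing each input-weight multiplication with a 1-bit signed AND followed by a count-based accumulation. The function names, the encoding of the +/- 1 weights as single bits, and the assumption that the BN scale is positive and that the binary activation fires when the BN output is >= 0 are illustrative choices, not the paper's exact implementation.

    import numpy as np

    def fold_bn_into_threshold(gamma, beta, mu, sigma):
        # Assumed activation rule: the neuron fires when the BN output is >= 0,
        # with gamma > 0 and sigma > 0. Then
        #   gamma * (s - mu) / sigma + beta >= 0   <=>   s >= mu - beta * sigma / gamma,
        # so the whole BN layer collapses into a per-neuron threshold.
        return mu - beta * sigma / gamma

    def signed_and_neuron(x_bits, w_bits, theta):
        # x_bits: binary inputs in {0, 1}.
        # w_bits: weight sign bits, with 1 encoding +1 and 0 encoding -1 (assumed encoding).
        # Each product x_i * w_i is a 1-bit signed AND: 0 if x_i = 0,
        # +1 if x_i = 1 and the weight is +1, -1 if x_i = 1 and the weight is -1.
        pos = np.count_nonzero(x_bits & w_bits)   # active inputs hitting +1 weights
        act = np.count_nonzero(x_bits)            # all active inputs
        s = 2 * pos - act                         # equals sum_i x_i * w_i, no multiplies
        return 1 if s >= theta else 0             # threshold already absorbs the BN shift

    # Example: one neuron with 256 binary inputs and a BN-folded threshold.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 256)                   # binary inputs (0 or 1)
    w = rng.integers(0, 2, 256)                   # weight sign bits
    theta = fold_bn_into_threshold(gamma=1.0, beta=0.5, mu=3.0, sigma=8.0)
    print(signed_and_neuron(x, w, theta))

In hardware, the two counts reduce to AND gates plus population counters, which is consistent with the large reductions in computing resources, memory, and power that the abstract reports relative to a 32-bit MLP design.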

Bibliographic Record

  • Source
    Neurocomputing | 2020, Issue 7 | pp. 351-360 | 10 pages
  • Author Affiliations

    Univ Elect Sci & Technol China, State Key Lab Elect Thin Films & Integrated Devic, Chengdu 610054, Peoples R China;

    Univ Elect Sci & Technol China, State Key Lab Elect Thin Films & Integrated Devic, Chengdu 610054, Peoples R China | Univ Elect Sci & Technol China, Brain Inspired Integrated Chip & Syst Res Ctr, Chengdu 610054, Peoples R China;

    Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore;

    Univ Elect Sci & Technol China, State Key Lab Elect Thin Films & Integrated Devic, Chengdu 610054, Peoples R China;

    Univ Elect Sci & Technol China, State Key Lab Elect Thin Films & Integrated Devic, Chengdu 610054, Peoples R China;

    Univ Elect Sci & Technol China, State Key Lab Elect Thin Films & Integrated Devic, Chengdu 610054, Peoples R China;

    Univ Elect Sci & Technol China, State Key Lab Elect Thin Films & Integrated Devic, Chengdu 610054, Peoples R China | Univ Elect Sci & Technol China, Brain Inspired Integrated Chip & Syst Res Ctr, Chengdu 610054, Peoples R China;

  • Indexed in: Science Citation Index (SCI); Engineering Index (EI);
  • Original format: PDF
  • Language: eng
  • Chinese Library Classification (CLC):
  • Keywords

    Binary neural networks; Hardware-friendly; Spatio-temporal coding; Spiking neural networks;

