International Conference on Neural Information Processing

Efficient Learning Algorithm Using Compact Data Representation in Neural Networks



Abstract

Convolutional neural networks have dramatically improved prediction accuracy in a wide range of applications, such as vision recognition and natural language processing. However, recent neural networks often require several hundred megabytes of memory for their network parameters, which in turn consumes a large amount of energy during computation. To achieve better energy efficiency, this work investigates the effects of compact data representation on memory savings for network parameters in artificial neural networks, while maintaining comparable accuracy in both the training and inference phases. We have studied the dependence of prediction accuracy on the total number of bits in a fixed-point data representation, using a proper range for the synaptic weights. We have also proposed a dictionary-based architecture that uses a limited number of floating-point entries to represent all synaptic weights, with proper initialization and scaling factors to minimize the approximation error. Our experiments with a 5-layer convolutional neural network on the CIFAR-10 dataset show that 8 bits are sufficient for both bit-width reduction and the dictionary-based architecture, which achieve 96.0% and 96.5% relative accuracy respectively, compared to the conventional 32-bit floating-point representation.
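The two compact representations described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, bit/entry counts, and the simple 1-D k-means used to build the dictionary are illustrative assumptions.

```python
import numpy as np

def quantize_fixed_point(w, total_bits=8, frac_bits=6):
    """Round weights to a signed fixed-point grid with `total_bits` bits,
    `frac_bits` of them fractional (illustrative bit split, not the paper's)."""
    scale = 2 ** frac_bits
    lo = -(2 ** (total_bits - 1)) / scale          # most negative representable value
    hi = (2 ** (total_bits - 1) - 1) / scale       # most positive representable value
    return np.clip(np.round(w * scale) / scale, lo, hi)

def quantize_dictionary(w, n_entries=16, n_iter=10):
    """Approximate weights with a small dictionary of floating-point entries,
    refined by 1-D k-means so that quantization error is reduced."""
    flat = w.ravel()
    # initialize entries uniformly over the observed weight range
    entries = np.linspace(flat.min(), flat.max(), n_entries)
    for _ in range(n_iter):
        # assign each weight to its nearest dictionary entry
        idx = np.abs(flat[:, None] - entries[None, :]).argmin(axis=1)
        # move each entry to the mean of its assigned weights
        for k in range(n_entries):
            members = flat[idx == k]
            if members.size:
                entries[k] = members.mean()
    idx = np.abs(flat[:, None] - entries[None, :]).argmin(axis=1)
    return entries[idx].reshape(w.shape)
```

With 8 total bits, the fixed-point grid step is 2^-6, so the rounding error per weight is bounded by half a step as long as the weights stay within the representable range; the dictionary variant instead spends its budget on a handful of full-precision entries shared by all weights.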
