Home > Foreign Conference Papers > International Conference on Neural Information Processing > Efficient Learning Algorithm Using Compact Data Representation in Neural Networks

Efficient Learning Algorithm Using Compact Data Representation in Neural Networks



Abstract

Convolutional neural networks have dramatically improved prediction accuracy in a wide range of applications, such as vision recognition and natural language processing. However, recent neural networks often require several hundred megabytes of memory for their parameters, which in turn consume a large amount of energy during computation. To achieve better energy efficiency, this work investigates the effect of compact data representation on memory savings for network parameters in artificial neural networks, while maintaining comparable accuracy in both the training and inference phases. We study the dependence of prediction accuracy on the total number of bits in a fixed-point data representation, using a proper range for the synaptic weights. We also propose a dictionary-based architecture that uses a limited number of floating-point entries for all synaptic weights, with proper initialization and scaling factors to minimize the approximation error. Our experiments with a 5-layer convolutional neural network on the Cifar-10 dataset show that 8 bits are enough for both bit-width reduction and the dictionary-based architecture, achieving 96.0% and 96.5% relative accuracy respectively, compared to conventional 32-bit floating point.
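The two compression schemes named in the abstract can be illustrated with a short sketch: (1) fixed-point quantization, where each weight is mapped to a small signed integer plus a shared scale derived from the weight range, and (2) a dictionary-based representation, where every weight stores only an index into a small codebook of floating-point entries. The function names, the quantile initialization, and the Lloyd-iteration refinement below are assumptions for illustration, not the authors' exact scheme.

```python
import numpy as np

def quantize_fixed_point(weights, total_bits=8):
    """Map float weights to signed fixed-point integers with a shared scale.

    The scale is chosen from the observed weight range (the paper's
    "proper range for synaptic weights"). Assumes a nonzero weight array.
    """
    qmax = 2 ** (total_bits - 1) - 1            # e.g. 127 for 8 bits
    scale = np.max(np.abs(weights)) / qmax      # maps the range onto the integer grid
    q = np.clip(np.round(weights / scale), -qmax - 1, qmax)
    return q.astype(np.int8 if total_bits <= 8 else np.int32), scale

def dequantize(q, scale):
    """Recover approximate float weights from fixed-point integers."""
    return q.astype(np.float32) * scale

def dictionary_quantize(weights, num_entries=16):
    """Approximate weights with a small codebook of floating-point entries.

    Sketch of the dictionary-based idea: initialize the codebook from
    quantiles of the weight distribution, then refine it with a few
    1-D k-means (Lloyd) iterations to reduce the approximation error.
    """
    flat = weights.ravel()
    codebook = np.quantile(flat, np.linspace(0.0, 1.0, num_entries))
    for _ in range(10):
        # assign each weight to its nearest codebook entry
        idx = np.argmin(np.abs(flat[:, None] - codebook[None, :]), axis=1)
        # move each entry to the mean of the weights assigned to it
        for k in range(num_entries):
            if np.any(idx == k):
                codebook[k] = flat[idx == k].mean()
    return idx.reshape(weights.shape), codebook

# Example: compare both representations on a small random weight matrix.
rng = np.random.default_rng(0)
w = (rng.standard_normal((64, 64)) * 0.1).astype(np.float32)

q, s = quantize_fixed_point(w, total_bits=8)
err_fixed = np.max(np.abs(dequantize(q, s) - w))   # bounded by one quantization step

idx, codebook = dictionary_quantize(w, num_entries=16)
err_dict = np.mean(np.abs(codebook[idx] - w))
```

Note the storage trade-off this sketch makes concrete: fixed-point stores one small integer per weight plus a single scale, while the dictionary stores one index per weight plus `num_entries` float entries shared across the whole layer.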

