Chinese Conference on Pattern Recognition and Computer Vision

Training Low Bitwidth Model with Weight Normalization for Convolutional Neural Networks



Abstract

Convolutional Neural Networks (CNNs) are now widely used in computer vision applications, including image classification, object detection, and segmentation. However, their high memory footprint and computational intensity limit deployment on low-power embedded devices. We propose a method to train convolutional neural networks at low bitwidth by performing weight normalization. Normalization narrows the distribution of the weights, which enables the low-bitwidth network to achieve a good trade-off between range and precision. Moreover, adding a scaling factor to the weights addresses their inadequate expressiveness at low bitwidths, further improving classification performance. Experiments on various datasets show that our method achieves prediction accuracy comparable to that of full-precision models. Notably, the proposed scheme can quantize AlexNet to 3-bit fixed point on ImageNet with a top-1 accuracy drop of only 1%.
