
BinaryRelax: A Relaxation Approach for Training Deep Neural Networks with Quantized Weights



Abstract

We propose BinaryRelax, a simple two-phase algorithm, for training deep neural networks with quantized weights. The set constraint that characterizes the quantization of weights is not imposed until the late stage of training, and a sequence of pseudo quantized weights is maintained. Specifically, we relax the hard constraint into a continuous regularizer via the Moreau envelope, which turns out to be the squared Euclidean distance to the set of quantized weights. The pseudo quantized weights are obtained by linearly interpolating between the float weights and their quantizations. A continuation strategy is adopted to push the weights towards the quantized state by gradually increasing the regularization parameter. In the second phase, an exact quantization scheme with a small learning rate is invoked to guarantee fully quantized weights. We test BinaryRelax on the benchmark CIFAR-10 and CIFAR-100 color image datasets to demonstrate the superiority of the relaxed quantization approach and the improved accuracy over state-of-the-art training methods. Finally, we prove the convergence of BinaryRelax under an approximate orthogonality condition.
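To make the two-phase idea concrete, below is a minimal NumPy sketch of one BinaryRelax-style update for binary (scaled-sign) quantization. The function names, the per-layer scaling by the mean absolute value, and the interpolation form `(w + λ·Q(w)) / (1 + λ)` (the proximal step for the λ/2·dist² regularizer described in the abstract) are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def quantize_binary(w):
    # Projection onto scaled binary weights {±s}: the scale s that
    # minimizes the Euclidean distance is the mean absolute value.
    s = np.mean(np.abs(w))
    return s * np.sign(w)

def binaryrelax_step(w_float, grad, lr, lam, phase2=False):
    """One update step (sketch, assumed interface).

    Phase 1: take a gradient step on the float weights, then form the
    pseudo quantized weights as a linear interpolation between the float
    weights and their quantization (proximal step of the Moreau-envelope
    regularizer, weight lam/(1+lam) on the quantized point).
    Phase 2: return the exact quantization instead.
    """
    w_float = w_float - lr * grad
    w_q = quantize_binary(w_float)
    if phase2:
        return w_float, w_q
    # Pseudo quantized weights: convex combination of float and quantized.
    w_pseudo = (w_float + lam * w_q) / (1.0 + lam)
    return w_float, w_pseudo
```

Under the continuation strategy, `lam` grows over epochs, so `w_pseudo` drifts from the float weights toward `w_q`; flipping `phase2=True` near the end of training yields fully quantized weights, as the abstract describes.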
