
In-place Activated BatchNorm for Memory-Optimized Training of DNNs



Abstract

In this work we present In-Place Activated Batch Normalization (InPlace-ABN) - a novel approach to drastically reduce the training memory footprint of modern deep neural networks in a computationally efficient way. Our solution substitutes the conventionally used succession of BatchNorm + Activation layers with a single plugin layer, hence avoiding invasive framework surgery while providing straightforward applicability for existing deep learning frameworks. We obtain memory savings of up to 50% by dropping intermediate results and by recovering required information during the backward pass through the inversion of stored forward results, with only minor increase (0.8-2%) in computation time. Also, we demonstrate how frequently used checkpointing approaches can be made computationally as efficient as InPlace-ABN. In our experiments on image classification, we demonstrate on-par results on ImageNet-1k with state-of-the-art approaches. On the memory-demanding task of semantic segmentation, we report results for COCO-Stuff, Cityscapes and Mapillary Vistas, obtaining new state-of-the-art results on the latter without additional training data but in a single-scale and -model scenario. Code can be found at https://github.com/mapillary/inplace_abn .
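The memory saving described in the abstract hinges on the activation being invertible: only the activation output is kept, and the BatchNorm output needed for the backward pass is recomputed by inverting the activation. The following NumPy sketch (a hypothetical illustration, not the authors' implementation, which fuses this into a CUDA layer) shows the core recovery step for a leaky ReLU activation:

```python
import numpy as np

def leaky_relu(y, slope=0.01):
    """Forward activation applied to the BatchNorm output y."""
    return np.where(y >= 0, y, slope * y)

def leaky_relu_inverse(z, slope=0.01):
    """Recover y from the stored activation output z.
    Invertible because slope > 0: negative outputs map back uniquely."""
    return np.where(z >= 0, z, z / slope)

# y is the BatchNorm output; a conventional implementation would have
# to store it for the backward pass.
y = np.random.randn(4, 3)

# InPlace-ABN stores only z (overwriting y's buffer in practice) ...
z = leaky_relu(y)

# ... and recovers y on demand during the backward pass.
y_recovered = leaky_relu_inverse(z)

assert np.allclose(y, y_recovered)
```

Dropping one of the two intermediate tensors per BatchNorm + Activation pair is what yields the reported up-to-50% memory saving, at the cost of the small extra inversion computation (the 0.8-2% overhead).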
