Machine Learning and Knowledge Extraction

Robust Learning with Implicit Residual Networks



Abstract

In this effort, we propose a new deep architecture utilizing residual blocks inspired by implicit discretization schemes. As opposed to standard feed-forward networks, the outputs of the proposed implicit residual blocks are defined as the fixed points of appropriately chosen nonlinear transformations. We show that this choice leads to improved stability of both forward and backward propagation, has a favorable impact on generalization power, and allows the robustness of the network to be controlled with only a few hyperparameters. In addition, the proposed reformulation of ResNet does not introduce new parameters and can potentially lead to a reduction in the number of required layers due to improved forward stability. Finally, we derive a memory-efficient training algorithm, propose a stochastic regularization technique, and provide numerical results in support of our findings.
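
The abstract contrasts the explicit ResNet update y = x + f(x) with an implicit block whose output is defined by the fixed-point equation y = x + f(y). As a rough illustration only, the following minimal PyTorch sketch solves that equation by plain fixed-point iteration; the paper's actual parameterization of f, its solver, and its memory-efficient backward pass are not given in the abstract, so all names and choices here (ImplicitResBlock, n_iters, tol) are illustrative assumptions.

    # Minimal sketch of an implicit residual block, assuming the fixed-point
    # formulation y = x + f(y) from the abstract. Illustrative only: the
    # paper's actual f, solver, and backward pass are not specified here.
    import torch
    import torch.nn as nn

    class ImplicitResBlock(nn.Module):
        """Block whose output y solves y = x + f(y) (implicit scheme),
        in contrast to the explicit ResNet update y = x + f(x)."""

        def __init__(self, dim, n_iters=20, tol=1e-4):
            super().__init__()
            self.f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh())
            self.n_iters = n_iters  # cap on fixed-point iterations
            self.tol = tol          # relative stopping tolerance

        def forward(self, x):
            y = x  # initialize the iteration at the block input
            for _ in range(self.n_iters):
                y_next = x + self.f(y)
                # stop once the iterate has (approximately) converged
                if (y_next - y).norm() < self.tol * y.norm().clamp_min(1e-12):
                    return y_next
                y = y_next
            return y

    # Usage: behaves like an ordinary residual block at call time.
    block = ImplicitResBlock(dim=8)
    out = block(torch.randn(4, 8))

Note that plain fixed-point iteration converges only when the map y -> x + f(y) is contractive (Lipschitz constant of f below 1); the stability and robustness properties claimed above depend on how the actual architecture enforces such conditions, which this sketch does not attempt.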
