
Speeding up training of deep neural networks using inconsistent stochastic gradient descent


Abstract

Aspects of the present disclosure describe techniques for training convolutional neural networks using an inconsistent stochastic gradient descent (ISGD) algorithm. The training effort spent on a training batch by the ISGD algorithm is dynamically adjusted according to the loss determined for that batch, which classifies the batch into one of two sub-states: well-trained or under-trained. The ISGD algorithm increases iterations for under-trained batches while reducing iterations for well-trained batches.
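
The abstract describes the control flow only at a high level, so the following is a minimal, self-contained sketch of an ISGD-style loop on a toy linear-regression problem. The adaptive loss threshold (running mean plus one standard deviation of past batch losses) and the cap on extra iterations per batch are illustrative assumptions, not details stated in the patent abstract.

```python
import numpy as np

# Toy data: a linear model with known weights plus noise.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(256, 2))
y = X @ w_true + 0.1 * rng.normal(size=256)

def batch_loss(w, xb, yb):
    # Mean squared error on one batch.
    return float(np.mean((xb @ w - yb) ** 2))

def sgd_step(w, xb, yb, lr):
    # One gradient step on the batch: grad of mean((Xw - y)^2).
    grad = 2.0 * xb.T @ (xb @ w - yb) / len(yb)
    return w - lr * grad

w = np.zeros(2)
lr = 0.05
batch_size = 32
loss_history = []
MAX_EXTRA = 4  # cap on extra iterations for one batch (assumption)

for epoch in range(10):
    for start in range(0, len(X), batch_size):
        xb, yb = X[start:start + batch_size], y[start:start + batch_size]
        w = sgd_step(w, xb, yb, lr)      # every batch gets at least one step
        loss = batch_loss(w, xb, yb)
        loss_history.append(loss)
        # A batch whose loss stays above the running mean + std is treated
        # as under-trained and receives extra gradient steps; well-trained
        # batches fall through after the single step above.
        threshold = np.mean(loss_history) + np.std(loss_history)
        extra = 0
        while loss > threshold and extra < MAX_EXTRA:
            w = sgd_step(w, xb, yb, lr)
            loss = batch_loss(w, xb, yb)
            extra += 1

print("learned weights:", w)  # approaches w_true = [2, -1]
```

Bounding the extra iterations keeps the loop from dwelling on a single noisy batch; the well-trained/under-trained split concentrates training effort where the measured loss indicates it is needed.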

