Speeding up training of deep neural networks using inconsistent stochastic gradient descent
Abstract
Aspects of the present disclosure describe techniques for training convolutional neural networks using an inconsistent stochastic gradient descent (ISGD) algorithm. The training effort applied to a batch is dynamically adjusted according to the loss determined for that specific batch, which is classified into one of two sub-states (well-trained or under-trained). The ISGD algorithm increases iterations for under-trained batches while reducing iterations for well-trained batches.
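The per-batch adjustment described in the abstract can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the function names, the mean-plus-two-standard-deviations threshold used to classify a batch as under-trained, and the plain gradient update rule are all assumptions.

```python
import numpy as np

def isgd_step(params, batch, loss_fn, grad_fn, lr=0.01,
              loss_history=None, max_extra_iters=5):
    """One ISGD-style update: a batch whose loss is high relative to
    recent history is treated as "under-trained" and receives extra
    gradient iterations; other batches get a single step.
    Illustrative sketch, not the patented method."""
    if loss_history is None:
        loss_history = []
    loss = loss_fn(params, batch)
    # Assumed classification criterion: under-trained if the batch
    # loss exceeds mean + 2*std of the recent loss history.
    if len(loss_history) >= 2:
        thresh = np.mean(loss_history) + 2.0 * np.std(loss_history)
    else:
        thresh = float("inf")  # not enough history yet
    iters = 1
    # Extra iterations only for under-trained batches, capped to
    # keep the schedule bounded.
    while loss > thresh and iters <= max_extra_iters:
        params = params - lr * grad_fn(params, batch)
        loss = loss_fn(params, batch)
        iters += 1
    # Every batch receives at least one update.
    params = params - lr * grad_fn(params, batch)
    loss_history.append(loss_fn(params, batch))
    return params, loss_history, iters
```

A well-trained batch (loss within the historical range) falls through the loop and costs one gradient step; an under-trained batch is iterated until its loss drops below the threshold or the cap is hit, which is the dynamic effort adjustment the abstract describes.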