Source: Computer Engineering and Design (《计算机工程与设计》)

A Stochastic Gradient Descent Algorithm for Convolutional Neural Networks


Abstract

To address the sensitivity of the stochastic gradient descent (SGD) algorithm to an improperly set learning rate in convolutional neural networks (CNNs), an SGD algorithm with an adaptive learning rate was proposed, in which the learning rate varies periodically as iterations proceed. To remedy the defect that the ReLU activation function discards neurons with negative activations in a CNN, a CNN using Leaky ReLU as its activation function was designed. A number of experiments were conducted to verify the feasibility of the Leaky ReLU activation function. The experimental results show that SGD with the proposed learning-rate update not only accelerates network convergence but also improves learning accuracy, and that combining Leaky ReLU with the proposed SGD further improves the learning accuracy of the CNN.
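The two ingredients the abstract describes, a periodically varying learning rate for SGD and the Leaky ReLU activation, can be sketched as follows. This is a minimal NumPy illustration, not the paper's method: the triangular shape of the schedule and the values of `base_lr`, `max_lr`, `period`, and the slope `alpha` are all assumptions, since the abstract does not specify the exact update rule.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha for negative inputs
    # instead of zeroing them out as plain ReLU does.
    return np.where(x > 0, x, alpha * x)

def cyclical_lr(step, base_lr=0.001, max_lr=0.01, period=1000):
    # One possible periodic schedule (triangular): the learning rate
    # rises from base_lr to max_lr and falls back once per period.
    # Hypothetical parameters; the paper's schedule may differ.
    cycle_pos = (step % period) / period
    tri = 1.0 - abs(2.0 * cycle_pos - 1.0)  # 0 -> 1 -> 0 over one period
    return base_lr + (max_lr - base_lr) * tri

# A single SGD parameter update using the scheduled learning rate.
w = np.array([0.5, -0.3])
grad = np.array([0.1, -0.2])
step = 250
w = w - cyclical_lr(step) * grad
```

In this sketch the learning rate at step 250 sits halfway up the triangle, so the update is larger than at the start of a cycle; the periodic rise and fall is what lets training escape a poorly chosen initial rate.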


