Neural networks with millions of parameters, trained on large datasets, overfit easily. A range of regularization methods has been proposed to constrain the parameters during training. This paper reviews the L1, L2, and Dropout regularization methods used in deep learning. Finally, comparative numerical experiments on MNIST handwritten-digit recognition are conducted using these regularization methods.
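As a minimal illustration (not code from the paper itself), the three regularizers surveyed can be sketched in plain NumPy: L1 adds λ·Σ|w| to the loss, L2 adds λ·Σw², and inverted Dropout zeroes each activation with probability p during training while rescaling by 1/(1−p). The penalty coefficient `lam` and drop rate `p` below are illustrative defaults, not values from the paper.

```python
import numpy as np

def l1_penalty(w, lam=1e-3):
    # L1 regularization: lam * sum of absolute weights (encourages sparsity).
    return lam * np.sum(np.abs(w))

def l2_penalty(w, lam=1e-3):
    # L2 regularization (weight decay): lam * sum of squared weights.
    return lam * np.sum(w ** 2)

def dropout(x, p=0.5, training=True, rng=None):
    # Inverted dropout: zero each unit with probability p and rescale the
    # survivors by 1/(1 - p), so the expected activation is unchanged and
    # no rescaling is needed at test time.
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = (rng.random(x.shape) >= p).astype(x.dtype)
    return x * mask / (1.0 - p)

w = np.array([0.5, -1.0, 2.0])
print(l1_penalty(w, lam=0.1))  # 0.1 * (0.5 + 1.0 + 2.0) ≈ 0.35
print(l2_penalty(w, lam=0.1))  # 0.1 * (0.25 + 1.0 + 4.0) ≈ 0.525
```

In practice the penalty is added to the data loss before backpropagation, whereas Dropout is applied to hidden activations only while `training=True`.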