Symmetric Rectified Linear Units for Fully Connected Deep Models

International Conference on Knowledge Science, Engineering and Management


Abstract

The Rectified Linear Unit (ReLU) is one of the key factors behind the success of deep learning models: it has been shown that deep networks can be trained efficiently with ReLU and without pre-training. In this paper, we compare and analyze several ReLU variants in fully-connected deep neural networks, testing ReLU, LReLU, ELU, SELU, mReLU and vReLU on two popular datasets, MNIST and Fashion-MNIST. We find that vReLU, a symmetric ReLU variant, shows promising results in most experiments. Fully-connected networks (FCNs) with vReLU activation reach higher accuracy, achieving a relative improvement in test error rate of 39.9% over ReLU on MNIST and of 6.3% over ReLU on Fashion-MNIST.
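To make the comparison concrete, the short sketch below evaluates the activations named in the abstract on a few sample inputs. The abstract itself gives no formulas, so the definitions are the ones commonly used in the literature; in particular, vReLU is assumed here to be the V-shaped (absolute-value) symmetric variant, and mReLU is omitted because its exact form is not stated in this abstract.

import numpy as np

def relu(x):
    # Standard ReLU: zero for negative inputs, identity otherwise
    return np.maximum(0.0, x)

def lrelu(x, alpha=0.01):
    # Leaky ReLU: small negative slope alpha instead of a hard zero
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: smooth saturation towards -alpha for x < 0
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU with the self-normalizing constants of Klambauer et al.
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def vrelu(x):
    # Assumed definition: symmetric "V-shaped" ReLU, f(x) = |x|
    return np.abs(x)

x = np.linspace(-3.0, 3.0, 7)
for name, f in [("ReLU", relu), ("LReLU", lrelu), ("ELU", elu),
                ("SELU", selu), ("vReLU", vrelu)]:
    print(f"{name:6s}", np.round(f(x), 3))

Note that the symmetry of vReLU is visible directly in the printed values: unlike the other variants, it maps negative inputs to the same magnitudes as their positive counterparts.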
